
2024 | Original Paper | Book Chapter

Minimum Kernel Discrepancy Estimators

Author: Chris J. Oates

Published in: Monte Carlo and Quasi-Monte Carlo Methods

Publisher: Springer International Publishing

Abstract

For two decades, reproducing kernels and their associated discrepancies have facilitated elegant theoretical analyses in the setting of quasi-Monte Carlo. These same tools are now receiving interest in statistics and related fields, as criteria that can be used to select an appropriate statistical model for a given dataset. This article focuses on minimum kernel discrepancy estimators: their use in statistical applications is reviewed, and a general theoretical framework for establishing their asymptotic properties is presented.
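To make the idea of the abstract concrete, the following sketch (not code from the chapter) fits the location parameter of a Gaussian model by minimising an empirical squared maximum mean discrepancy, a standard kernel discrepancy. The model family, Gaussian kernel, bandwidth, and grid search are all hypothetical choices made for illustration.

```python
# Illustrative minimum kernel discrepancy estimator: choose the model
# parameter whose simulated sample is closest to the data, as measured
# by the (squared) maximum mean discrepancy with a Gaussian kernel.
import numpy as np

def mmd2(x, y, bandwidth=1.0):
    """Biased (V-statistic) estimate of the squared MMD between samples x, y."""
    def gram(a, b):
        # Gaussian kernel k(a_i, b_j) = exp(-(a_i - b_j)^2 / (2 * bandwidth^2))
        d = a[:, None] - b[None, :]
        return np.exp(-d**2 / (2.0 * bandwidth**2))
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

def minimum_mmd_estimate(data, thetas, n_model=500, seed=0):
    """Grid search for the theta whose model sample N(theta, 1) minimises MMD."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n_model)  # common random numbers across thetas
    scores = [mmd2(data, theta + noise) for theta in thetas]
    return thetas[int(np.argmin(scores))]

# Hypothetical data: the true location is 2.0, so the estimate should be near it.
rng = np.random.default_rng(42)
data = rng.normal(loc=2.0, scale=1.0, size=300)
grid = np.linspace(0.0, 4.0, 81)
theta_hat = minimum_mmd_estimate(data, grid)
```

In practice the grid search would be replaced by gradient-based optimisation over the model parameters, and the V-statistic by the unbiased U-statistic estimate; the chapter's theoretical framework concerns the asymptotic behaviour of estimators of this general form.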

Footnotes
1. i.e., additional levels of differentiability in axis-aligned directions.
 
Metadata
Title
Minimum Kernel Discrepancy Estimators
Author
Chris J. Oates
Copyright year
2024
DOI
https://doi.org/10.1007/978-3-031-59762-6_6