Published in: Journal of Scientific Computing 1/2024

01.01.2024

A Machine Learning Framework for Geodesics Under Spherical Wasserstein–Fisher–Rao Metric and Its Application for Weighted Sample Generation

Authors: Yang Jing, Jiaheng Chen, Lei Li, Jianfeng Lu


Abstract

The Wasserstein–Fisher–Rao (WFR) distance is a family of metrics that gauge the discrepancy between two Radon measures, taking into account both transportation and weight change. The spherical WFR distance is a projected version of the WFR distance for probability measures, so that the space of Radon measures equipped with WFR can be viewed as a metric cone over the space of probability measures equipped with spherical WFR. Compared with the Wasserstein distance, geodesics under the spherical WFR metric are less well understood and remain an active research topic. In this paper, we develop a deep learning framework to compute geodesics under the spherical WFR metric, and the learned geodesics can be adopted to generate weighted samples. Our approach is based on a Benamou–Brenier-type dynamic formulation for the spherical WFR. To overcome the difficulty of enforcing the boundary constraint introduced by the weight change, a Kullback–Leibler divergence term based on the inverse map is added to the cost function. Moreover, a new regularization term using the particle velocity is introduced as a substitute for the Hamilton–Jacobi equation for the potential in the dynamic formulation. When used for sample generation, our framework can benefit applications with given weighted samples, especially in Bayesian inference, compared with sample generation by previous flow models.
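The two ingredients named in the abstract (a Benamou–Brenier-type action with a growth term, and a KL penalty standing in for the hard terminal constraint) can be illustrated with a toy particle discretization. The following sketch is our own minimal illustration, not the paper's implementation: function names, the discretization, and all constants are assumptions.

```python
import numpy as np

def wfr_action(weights, velocities, growth, dt):
    """Discrete Benamou-Brenier-type WFR action for weighted particles.

    weights:    (T, N)    particle weights w_i(t_k)
    velocities: (T, N, d) particle velocities v_i(t_k)
    growth:     (T, N)    growth rates g_i(t_k), the source term in the
                          continuity equation  d/dt rho + div(rho v) = g rho
    """
    # Transport (Wasserstein) part: kinetic energy weighted by mass.
    kinetic = np.sum(weights * np.sum(velocities ** 2, axis=-1))
    # Fisher-Rao part: penalizes weight change through the growth rate.
    fisher_rao = 0.25 * np.sum(weights * growth ** 2)
    return dt * (kinetic + fisher_rao)

def kl_penalty(w, w_target, eps=1e-12):
    """KL divergence between normalized weight vectors, used as a soft
    substitute for the terminal constraint rho(1) = rho_1."""
    p = w / w.sum()
    q = w_target / w_target.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Toy usage: 2 time steps, 3 particles in 1D, static flow.
T, N, d = 2, 3, 1
w = np.ones((T, N)) / N
v = np.zeros((T, N, d))
g = np.zeros((T, N))
print(wfr_action(w, v, g, dt=0.5))        # zero action for a static flow
print(kl_penalty(w[-1], np.ones(N) / N))  # zero penalty when weights match
```

In a learning setting, the velocity and growth fields would be parametrized by neural networks and the sum of the action and the KL penalty minimized over their parameters; the sketch above only shows the shape of the objective.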


Footnotes
1. https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary.html
Metadata
Title
A Machine Learning Framework for Geodesics Under Spherical Wasserstein–Fisher–Rao Metric and Its Application for Weighted Sample Generation
Authors
Yang Jing
Jiaheng Chen
Lei Li
Jianfeng Lu
Publication date
01.01.2024
Publisher
Springer US
Published in
Journal of Scientific Computing / Issue 1/2024
Print ISSN: 0885-7474
Electronic ISSN: 1573-7691
DOI
https://doi.org/10.1007/s10915-023-02396-y
