2021 | OriginalPaper | Chapter

Principled Interpolation in Normalizing Flows

Authors : Samuel G. Fadel, Sebastian Mair, Ricardo da S. Torres, Ulf Brefeld

Published in: Machine Learning and Knowledge Discovery in Databases. Research Track

Publisher: Springer International Publishing


Abstract

Generative models based on normalizing flows are very successful at modeling complex data distributions using simpler ones. However, straightforward linear interpolations show unexpected side effects, as interpolation paths lie outside the region where samples are observed. This is caused by the standard choice of a Gaussian base distribution and can be seen in the norms of the interpolated samples, which fall outside the data manifold. This observation suggests that changing the way of interpolating should generally result in better interpolations, but it is not clear how to do so in an unambiguous way. In this paper, we solve this issue by enforcing a specific manifold and, hence, changing the base distribution, allowing for a principled way of interpolation. Specifically, we use the Dirichlet and von Mises-Fisher base distributions on the probability simplex and the hypersphere, respectively. Our experimental results show superior performance in terms of bits per dimension, Fréchet Inception Distance (FID), and Kernel Inception Distance (KID) scores for interpolation, while maintaining the generative performance.
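The norm effect described in the abstract is easy to reproduce. A minimal sketch (not the paper's code; dimensionality and seed are arbitrary choices for illustration): in high dimensions, samples from a standard Gaussian concentrate at norm roughly the square root of the dimension, so the linear midpoint of two independent samples has a markedly smaller norm and leaves the region where the flow observed data. Spherical linear interpolation (slerp, Shoemake 1985), which the hyperspherical base distribution makes principled, keeps the interpolant's norm close to that of the endpoints.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512  # latent dimensionality, chosen only for illustration

# Two independent draws from a standard Gaussian base distribution.
z0 = rng.standard_normal(d)
z1 = rng.standard_normal(d)

# Linear midpoint: each coordinate has variance 1/2, so its norm
# concentrates near sqrt(d/2) instead of the typical sqrt(d).
mid = 0.5 * (z0 + z1)
print(np.linalg.norm(z0))   # ~ sqrt(512), i.e. around 22.6
print(np.linalg.norm(mid))  # ~ sqrt(256), i.e. around 16.0

def slerp(a, b, t):
    """Spherical linear interpolation between vectors a and b."""
    cos_omega = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    return (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)

# The slerp midpoint stays near the norm of the endpoints.
print(np.linalg.norm(slerp(z0, z1, 0.5)))  # ~ 22.6 again
```

This sketch only illustrates the geometric motivation; the paper's contribution is to change the base distribution itself (Dirichlet or von Mises-Fisher) so that such interpolation paths are principled rather than a post hoc fix.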


Metadata
DOI
https://doi.org/10.1007/978-3-030-86520-7_8
