Published in: Computational Mechanics 2/2019

21.05.2019 | Original Paper

Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems

By: Yibo Yang, Paris Perdikaris


Abstract

We present a probabilistic deep learning methodology that enables the construction of predictive data-driven surrogates for stochastic systems. Leveraging recent advances in variational inference with implicit distributions, we put forth a statistical inference framework that enables the end-to-end training of surrogate models on paired input–output observations that may be stochastic in nature, originate from different information sources of variable fidelity, or be corrupted by complex noise processes. The resulting surrogates can accommodate high-dimensional inputs and outputs and are able to return predictions with quantified uncertainty. The effectiveness of our approach is demonstrated through a series of canonical studies, including the regression of noisy data, multi-fidelity modeling of stochastic processes, and uncertainty propagation in high-dimensional dynamical systems.
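The paper's implicit-variational surrogates are not reproduced here, but the core idea the abstract describes, a surrogate trained on paired input–output observations of a stochastic system that returns predictions with quantified uncertainty, can be sketched with a much simpler stand-in: a bootstrap ensemble of polynomial fits. All names and the toy system below are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_system(x, rng):
    # Toy stochastic system: smooth trend plus input-dependent noise.
    return np.sin(2 * np.pi * x) + 0.1 * (1 + x) * rng.standard_normal(x.shape)

# Paired input-output observations of the stochastic system.
x_train = rng.uniform(0.0, 1.0, 200)
y_train = noisy_system(x_train, rng)

# Bootstrap ensemble of cubic fits acting as the surrogate.
coeffs = []
for _ in range(50):
    idx = rng.integers(0, x_train.size, x_train.size)
    coeffs.append(np.polyfit(x_train[idx], y_train[idx], deg=3))

def predict(x):
    """Predictive mean and standard deviation across the ensemble."""
    samples = np.stack([np.polyval(c, x) for c in coeffs])
    return samples.mean(axis=0), samples.std(axis=0)

x_test = np.linspace(0.0, 1.0, 5)
mean, std = predict(x_test)
```

The ensemble spread plays the role that the learned latent variables play in the paper: it separates a point prediction from a measure of confidence in it, which is what makes such surrogates usable for downstream uncertainty propagation.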


Metadata
Title
Conditional deep surrogate models for stochastic, high-dimensional, and multi-fidelity systems
Authors
Yibo Yang
Paris Perdikaris
Publication date
21.05.2019
Publisher
Springer Berlin Heidelberg
Published in
Computational Mechanics / Issue 2/2019
Print ISSN: 0178-7675
Electronic ISSN: 1432-0924
DOI
https://doi.org/10.1007/s00466-019-01718-y
