
2015 | Original Paper | Book Chapter

8. MCMC-Driven Adaptive Multiple Importance Sampling

Authors: Luca Martino, Víctor Elvira, David Luengo, Jukka Corander

Published in: Interdisciplinary Bayesian Statistics

Publisher: Springer International Publishing


Abstract

Monte Carlo (MC) methods are widely used for statistical inference and stochastic optimization. A well-known class of MC methods is composed of importance sampling (IS) and its adaptive extensions (such as adaptive multiple IS and population MC). In this work, we introduce an iterated batch importance sampler using a population of proposal densities, which are adapted according to a Markov Chain Monte Carlo (MCMC) technique over the population of location parameters. The novel algorithm provides a global estimation of the variables of interest iteratively, using all the generated samples weighted according to the so-called deterministic mixture scheme. Compared with a traditional multiple IS scheme with the same number of samples, the performance is substantially improved at the expense of a slight increase in the computational cost due to the additional MCMC steps. Moreover, the dependence on the choice of the initial cloud of proposals is noticeably reduced, since the proposal density in the MCMC method can be adapted in order to optimize the performance. Numerical results show the advantages of the proposed sampling scheme in terms of mean absolute error.
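To make the deterministic mixture weighting mentioned above concrete, here is a minimal, self-contained sketch in Python. It illustrates only the weighting scheme, not the full algorithm of the chapter (in particular, the MCMC adaptation of the location parameters is omitted); all names and numerical settings are hypothetical. Each sample drawn from one proposal in the population is weighted by the target divided by the full mixture of all proposals, rather than by its own proposal alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Unnormalized target: an equal-weight mixture of two Gaussians at +/-2.
    return 0.5 * np.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x + 2.0) ** 2)

def gauss_pdf(x, mu, sigma):
    # Univariate Gaussian density, vectorized over x.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

mus = np.array([-3.0, 0.0, 3.0])   # population of location parameters
sigma = 1.0                        # common scale of the proposals
M = 1000                           # samples drawn per proposal

samples, weights = [], []
for mu in mus:
    x = rng.normal(mu, sigma, size=M)
    # Deterministic mixture weight: the denominator is the whole mixture
    # of proposals, not just the proposal that generated x.
    mix = np.mean([gauss_pdf(x, m, sigma) for m in mus], axis=0)
    samples.append(x)
    weights.append(target(x) / mix)

x = np.concatenate(samples)
w = np.concatenate(weights)
est_mean = np.sum(w * x) / np.sum(w)   # self-normalized IS estimate of E[x]
```

Since the target is symmetric around zero, `est_mean` should land close to zero; the deterministic mixture denominator is what keeps the weights stable even when an individual proposal covers the target poorly.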


Footnotes
1
A fair comparison of the computational cost of MAMIS and PMC deserves further discussion. On the one hand, MAMIS uses NT samples for the estimation and (setting \(\mathcal{T}=T_a\), as in this example) also performs T iterations of the SMH technique, which requires drawing T proposed samples, T samples from multinomial pdfs, and T uniform random variables (RVs); thus, it generates 3T additional RVs with respect to a simple iterative IS scheme. Moreover, the target needs to be evaluated at T new points. On the other hand, PMC uses NT samples in the estimation and also needs to draw NT samples from multinomial densities (resampling steps), thus requiring NT additional RVs with respect to a simple iterative IS scheme.
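The random-variate accounting in the footnote can be tallied directly. A small sketch (function names are hypothetical; it counts only the variates drawn beyond those of a plain iterative IS scheme):

```python
def extra_rvs_mamis(N, T):
    # T proposed samples + T multinomial draws + T uniforms for the SMH steps.
    return 3 * T

def extra_rvs_pmc(N, T):
    # N*T multinomial draws for the resampling steps.
    return N * T

N, T = 100, 2000
assert extra_rvs_mamis(N, T) < extra_rvs_pmc(N, T)  # 6000 vs 200000
```

For any population size N > 3, the MAMIS overhead of 3T extra variates grows more slowly than the NT resampling draws of PMC, which is the point the footnote makes.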
 
Metadata
Title
MCMC-Driven Adaptive Multiple Importance Sampling
Authors
Luca Martino
Víctor Elvira
David Luengo
Jukka Corander
Copyright Year
2015
DOI
https://doi.org/10.1007/978-3-319-12454-4_8