2021 | Original Paper | Book Chapter

Learning Enabled Constrained Black-Box Optimization

Authors: F. Archetti, A. Candelieri, B. G. Galuzzi, R. Perego

Published in: Black Box Optimization, Machine Learning, and No-Free Lunch Theorems

Publisher: Springer International Publishing
Abstract

This chapter addresses black-box constrained optimization, where both the objective function and the constraints are unknown and can only be observed pointwise. Both deterministic and probabilistic surrogate models are considered; the latter, analysed in more detail, are based on Gaussian Processes and Bayesian Optimization, which handle the exploration–exploitation dilemma and improve sample efficiency. Particularly challenging is the case where the feasible region may be disconnected and the objective function cannot be evaluated outside it. This situation, known as a "partially defined objective function" or "non-computable domains", requires a novel two-phase approach: a first phase uses SVM classification to learn the feasible region, and a second, optimization phase is based on a Gaussian Process. This approach is the main focus of the chapter, which analyses the modelling and computational issues and demonstrates the sample efficiency of the resulting algorithms.
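The two-phase idea in the abstract can be illustrated with a minimal sketch: an SVM classifier is first fit on sampled points to estimate the feasible region, and Bayesian optimization with a Gaussian Process surrogate then proposes new points only where the classifier predicts feasibility. This is not the chapter's algorithm, only an illustration of the scheme; the toy problem (a feasible disc inside the unit box), the RBF/Matérn kernels, and the lower-confidence-bound acquisition are all assumptions chosen for brevity.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical partially defined problem: the objective is only
# computable inside a disc within the unit box.
def feasible(x):
    return np.linalg.norm(x - 0.5, axis=-1) < 0.35

def objective(x):  # defined only on the feasible region
    return np.sin(6 * x[..., 0]) + (x[..., 1] - 0.5) ** 2

# Phase 1: sample the box and learn the feasible region with an SVM.
X = rng.uniform(0, 1, size=(60, 2))
y = feasible(X).astype(int)
svm = SVC(kernel="rbf", gamma=2.0).fit(X, y)

# Phase 2: GP-based optimization restricted to estimated-feasible points.
Xf, yf = X[y == 1], objective(X[y == 1])
for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                  normalize_y=True).fit(Xf, yf)
    cand = rng.uniform(0, 1, size=(2000, 2))
    cand = cand[svm.predict(cand) == 1]        # keep estimated-feasible candidates
    mu, sd = gp.predict(cand, return_std=True)
    x_next = cand[np.argmin(mu - 1.96 * sd)]   # lower confidence bound
    if feasible(x_next):                       # objective is computable here
        Xf = np.vstack([Xf, x_next])
        yf = np.append(yf, objective(x_next))

best = Xf[np.argmin(yf)]
print("best point:", best, "value:", yf.min())
```

In a real setting the true feasibility oracle is the expensive simulator itself, and the SVM would be refit as new evaluations reveal where the objective is computable.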


DOI: https://doi.org/10.1007/978-3-030-66515-9_1
