Abstract
Despite their ability to handle difficult optimization problems, genetic algorithms generally require a large number of evaluations to find an optimal or a satisfactory near-optimal solution. When expensive simulations are involved in the optimization process, using genetic algorithms as optimization tools can become unattractive. Surrogate models offer a promising way to overcome this drawback, either by replacing expensive evaluations or by allowing a wider exploration of the search space. In this paper we introduce a surrogate model based on similarity measures into a genetic algorithm in order to enhance its performance on optimization problems under a fixed budget of simulations. Numerical experiments assess the applicability and performance of the approach on both constrained and unconstrained optimization problems. The results show that the proposed framework is an attractive alternative for improving the final solutions under a fixed budget of expensive evaluations.
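The paper's exact framework is not reproduced here; purely as an illustration, the sketch below shows one common form of similarity-based surrogate (Shepard-style inverse-distance weighting over an archive of truly evaluated points) plugged into a toy real-coded genetic algorithm that spends a fixed budget of exact evaluations. All function names, operators, and parameter values are assumptions for this sketch, not the authors' implementation.

```python
import math
import random

def shepard_surrogate(archive, x, p=2, k=5, eps=1e-12):
    """Estimate f(x) by inverse-distance weighting over the k nearest
    archived (point, value) pairs -- a simple similarity-based model."""
    dists = []
    for xi, fi in archive:
        d = math.dist(x, xi)
        if d < eps:                 # query coincides with an archived point
            return fi
        dists.append((d, fi))
    dists.sort(key=lambda t: t[0])
    num = den = 0.0
    for d, fi in dists[:k]:
        w = 1.0 / d ** p            # closer (more similar) points weigh more
        num += w * fi
        den += w
    return num / den

def surrogate_assisted_ga(f, dim, budget, pop_size=20, gens=60,
                          lo=-5.0, hi=5.0, seed=0):
    """Toy real-coded GA (minimization): offspring are pre-screened with
    the surrogate, and only the most promising half receive a true,
    budgeted evaluation; assumes budget >= pop_size."""
    rng = random.Random(seed)
    archive, evals = [], 0

    def true_eval(x):
        nonlocal evals
        evals += 1
        fx = f(x)
        archive.append((x, fx))     # every exact evaluation feeds the model
        return fx

    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [true_eval(x) for x in pop]
    for _ in range(gens):
        children = []
        for _ in range(pop_size):
            i, j = rng.sample(range(pop_size), 2)          # binary tournament
            parent = pop[i] if fit[i] < fit[j] else pop[j]
            children.append([min(hi, max(lo, g + rng.gauss(0.0, 0.3)))
                             for g in parent])             # Gaussian mutation
        # rank children by the cheap surrogate; spend real evaluations
        # only on the most promising ones
        children.sort(key=lambda c: shepard_surrogate(archive, c))
        n_true = max(0, min(pop_size // 2, budget - evals))
        cfit = [true_eval(c) for c in children[:n_true]]
        cfit += [shepard_surrogate(archive, c) for c in children[n_true:]]
        merged = sorted(zip(pop + children, fit + cfit), key=lambda t: t[1])
        pop, fit = map(list, zip(*merged[:pop_size]))      # elitist replacement
        if evals >= budget:
            break
    return min(archive, key=lambda t: t[1])  # best truly evaluated point
```

The key design point the abstract describes is visible here: the surrogate never replaces the exact model outright; it screens candidates so that the fixed budget of expensive evaluations is concentrated on the most promising individuals.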
Fonseca, L.G., Barbosa, H.J.C. & Lemonge, A.C.C. A similarity-based surrogate model for enhanced performance in genetic algorithms. OPSEARCH 46, 89–107 (2009). https://doi.org/10.1007/s12597-009-0006-1