
2019 | Original Paper | Book Chapter

No Free Lunch Theorem: A Review

Authors: Stavros P. Adam, Stamatios-Aggelos N. Alexandropoulos, Panos M. Pardalos, Michael N. Vrahatis

Published in: Approximation and Optimization

Publisher: Springer International Publishing


Abstract

The “No Free Lunch” theorem states that, averaged over all optimization problems and without re-sampling, all optimization algorithms perform equally well. Optimization, search, and supervised learning are the areas that have benefited most from this important theoretical concept. The formulation of the initial No Free Lunch theorem quickly gave rise to a number of research works, resulting in a suite of theorems that define an entire research field, with significant results in other scientific areas where successfully exploring a search space is an essential and critical task. The objective of this paper is to survey the main research efforts that have contributed to this field, highlight the main issues, and clarify the points that help in understanding the hypotheses, the restrictions, or even the inapplicability of the No Free Lunch theorems.
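For reference, the standard finite-domain formulation due to Wolpert and Macready, which this chapter reviews, can be sketched as follows; the notation below is a paraphrase of their statement, not a quotation from the chapter:

For any two optimization algorithms $a_1$ and $a_2$ and any number $m$ of distinct function evaluations,
\[
\sum_{f} P\bigl(d^{y}_{m} \mid f, m, a_1\bigr) \;=\; \sum_{f} P\bigl(d^{y}_{m} \mid f, m, a_2\bigr),
\]
where the sum runs over all objective functions $f \colon \mathcal{X} \to \mathcal{Y}$ on a finite search space $\mathcal{X}$, and $d^{y}_{m}$ denotes the sequence of cost values observed at the $m$ distinct points sampled by the algorithm. In words: no algorithm outperforms any other when performance is averaged uniformly over all possible objective functions.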


Footnotes
1
Source: Google Scholar.
 
Metadata
Title
No Free Lunch Theorem: A Review
Authors
Stavros P. Adam
Stamatios-Aggelos N. Alexandropoulos
Panos M. Pardalos
Michael N. Vrahatis
Copyright Year
2019
DOI
https://doi.org/10.1007/978-3-030-12767-1_5
