Abstract
What makes a problem easy or hard for a genetic algorithm (GA)? This question has become increasingly important as people have tried to apply the GA to ever more diverse types of problems. Much previous work on this question has studied the relationship between GA performance and the structure of a given fitness function when it is expressed as a Walsh polynomial. The work of Bethke, Goldberg, and others has produced certain theoretical results about this relationship. In this article we review these theoretical results, and then discuss a number of seemingly anomalous experimental results reported by Tanese concerning the performance of the GA on a subclass of Walsh polynomials, some members of which were expected to be easy for the GA to optimize. Tanese found that the GA was poor at optimizing all functions in this subclass, that a partitioning of a single large population into a number of smaller independent populations seemed to improve performance, and that hillclimbing outperformed both the original and partitioned forms of the GA on these functions. These results seemed to contradict several commonly held expectations about GAs.
We begin by reviewing schema processing in GAs. We then give an informal description of how Walsh analysis and Bethke's Walsh-schema transform relate to GA performance, and we discuss the relevance of this analysis for GA applications in optimization and machine learning. We then describe Tanese's surprising results, examine them experimentally and theoretically, and propose and evaluate some explanations. These explanations lead to a more fundamental question about GAs: what are the features of problems that determine the likelihood of successful GA performance?
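To make the abstract's central object concrete, here is a minimal sketch of a fitness function expressed as a Walsh polynomial over bit strings, using the standard Walsh basis psi_j(x) = (-1)^(number of 1 bits shared by j and x). The partition indices and coefficients below are illustrative only, not Tanese's actual functions.

```python
# Sketch: a fitness function expressed as a Walsh polynomial over
# bit strings of length l, f(x) = sum_j w_j * psi_j(x).
# The indices j and coefficients w_j here are made up for illustration.

def walsh(j, x):
    """Walsh basis function psi_j(x): +1 if j & x has an even number
    of 1 bits, -1 if it has an odd number."""
    return 1 if bin(j & x).count("1") % 2 == 0 else -1

def walsh_fitness(x, coefficients):
    """Evaluate the Walsh polynomial f(x) = sum over (j, w_j) pairs."""
    return sum(w * walsh(j, x) for j, w in coefficients.items())

# Example: two low-order Walsh terms on 8-bit strings.
coeffs = {0b00000011: 1.0, 0b00001100: 0.5}
x = 0b00000011  # overlaps the first partition in an even number of 1 bits
print(walsh_fitness(x, coeffs))  # 1.0 * (+1) + 0.5 * (+1) = 1.5
```

Each nonzero coefficient w_j contributes a term whose sign depends only on the bits of x selected by j, which is why the Walsh-schema transform links such coefficients to the average fitnesses of schemas.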
References
Belew, R.K. & Booker, L.B. (Eds.) (1991). Proceedings of the Fourth International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Bergman, A., & Feldman, M.W. (1990). More on selection for and against recombination. Theoretical Population Biology, 38 (1), 68–92.
Bethke, A.D. (1980). Genetic algorithms as function optimizers. (Doctoral dissertation, University of Michigan.) Dissertation Abstracts International, 41 (9), 3503B.
Das, R., & Whitley, L.D. (1991). The only challenging problems are deceptive: Global search by solving order-1 hyperplanes. In R.K. Belew & L.B. Booker (Eds.), Proceedings of the Fourth International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Davis, L.D. (Ed.) (1987). Genetic algorithms and simulated annealing. Research Notes in Artificial Intelligence. Los Altos, CA: Morgan Kaufmann.
Davis, L.D. (Ed.) (1991). The handbook of genetic algorithms. New York: Van Nostrand Reinhold.
De Jong, K.A. (1975). An analysis of the behavior of a class of genetic adaptive systems. Unpublished doctoral dissertation, University of Michigan, Ann Arbor, MI.
De Jong, K.A. (Ed.) (1990a). Special issue on genetic algorithms. Machine Learning, 5 (4).
De Jong, K.A. (1990b). Introduction to the second special issue on genetic algorithms. Machine Learning, 5 (4), 351–353.
De Jong, K.A., & Spears, W. (1991). Learning concept classification rules using genetic algorithms. In Proceedings of the Twelfth International Joint Conference on Artificial Intelligence, 651–656. San Mateo, CA: Morgan Kaufmann.
Forrest, S. (1985). Documentation for prisoner's dilemma and norms programs that use the genetic algorithm. Unpublished report, University of Michigan, Ann Arbor, MI.
Forrest, S., & Mitchell, M. (1991). The performance of genetic algorithms on Walsh polynomials: Some anomalous results and their explanation. In R.K. Belew & L.B. Booker (Eds.), Proceedings of the Fourth International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Forrest, S., & Mitchell, M. (1992). Towards a stronger building-blocks hypothesis: Effects of relative building-block fitness on GA performance. In L.D. Whitley (Ed.), Foundations of genetic algorithms 2. San Mateo, CA: Morgan Kaufmann.
Goldberg, D.E. (1985). Optimal initial population size for binary-coded genetic algorithms (Technical Report TCGA Report No. 85001). Tuscaloosa, AL: University of Alabama.
Goldberg, D.E. (1987). Simple genetic algorithms and the minimal deceptive problem. In L.D. Davis (Ed.), Genetic algorithms and simulated annealing. Research Notes in Artificial Intelligence. Los Altos, CA: Morgan Kaufmann.
Goldberg, D.E. (1989a). Genetic algorithms and Walsh functions: Part I, A gentle introduction. Complex Systems, 3, 129–152.
Goldberg, D.E. (1989b). Genetic algorithms and Walsh functions: Part II, Deception and its analysis. Complex Systems, 3, 153–171.
Goldberg, D.E. (1989c). Genetic algorithms in search, optimization, and machine learning. Reading, MA: Addison Wesley.
Goldberg, D.E. (1989d). Sizing populations for serial and parallel genetic algorithms. In J.D. Schaffer (Ed.), Proceedings of the Third International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Goldberg, D.E. (1991). Construction of high-order deceptive functions using low-order Walsh coefficients (Technical Report 90002). Urbana, IL: Illinois Genetic Algorithms Laboratory, Department of General Engineering, University of Illinois.
Goldberg, D.E., & Bridges, C.L. (1988). An analysis of a reordering operator on a GA-hard problem (Technical Report 88005). Tuscaloosa, AL: The Clearinghouse for Genetic Algorithms, Department of Engineering Mechanics, University of Alabama.
Goldberg, D.E., & Holland, J.H. (Eds.). (1988a). Special issue on genetic algorithms. Machine Learning, 3 (2–3).
Goldberg, D.E., & Holland, J.H. (1988b). Introduction to the first special issue on genetic algorithms. Machine Learning, 3 (2–3), 95–99.
Goldberg, D.E., Korb, B., & Deb, K. (1990). Messy genetic algorithms: Motivation, analysis, and first results. Complex Systems, 3, 493–530.
Grefenstette, J.J. (Ed.). (1985). Proceedings of the First International Conference on Genetic Algorithms and Their Applications. Hillsdale, NJ: Lawrence Erlbaum Associates.
Grefenstette, J.J. (Ed.). (1987). Genetic Algorithms and Their Applications: Proceedings of the Second International Conference on Genetic Algorithms. Hillsdale, NJ: Lawrence Erlbaum Associates.
Grefenstette, J.J., & Baker, J.E. (1989). How genetic algorithms work: A critical look at implicit parallelism. In J.D. Schaffer (Ed.), Proceedings of the Third International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Holland, J.H. (1975). Adaptation in natural and artificial systems. Ann Arbor, MI: University of Michigan Press.
Holland, J.H. (1986). Escaping brittleness: The possibilities of general-purpose learning algorithms applied to parallel rule-based systems. In R.S. Michalski, J.G. Carbonell, & T.M. Mitchell (Eds.), Machine learning (Vol. 2). San Mateo, CA: Morgan Kaufmann.
Holland, J.H. (1988). The dynamics of searches directed by genetic algorithms. In Y.C. Lee (Ed.), Evolution, learning, and cognition. Teaneck, NJ: World Scientific.
Holland, J.H. (1989). Using classifier systems to study adaptive nonlinear networks. In D.L. Stein (Ed.), Lectures in the sciences of complexity (Vol. 1). Reading, MA: Addison-Wesley.
Koza, J.R. (1990). Genetic programming: A paradigm for genetically breeding populations of computer programs to solve problems (Technical Report STAN-CS-90-1314). Stanford, CA: Department of Computer Science, Stanford University.
Liepins, G.E., & Vose, M.D. (1990). Representational issues in genetic optimization. Journal of Experimental and Theoretical Artificial Intelligence, 2, 101–115.
Miller, G.F., Todd, P.M., & Hegde, S.U. (1989). Designing neural networks using genetic algorithms. In J.D. Schaffer (Ed.), Proceedings of the Third International Conference on Genetic Algorithms (pp. 379–384). San Mateo, CA: Morgan Kaufmann.
Mitchell, M., Forrest, S., & Holland, J.H. (1992). The royal road for genetic algorithms: Fitness landscapes and GA performance. In Proceedings of the First European Conference on Artificial Life. Cambridge, MA: MIT Press/Bradford Books.
Packard, N.H. (1990). A genetic learning algorithm for the analysis of complex data. Complex Systems, 4 (5).
Schaffer, J.D. (Ed.) (1989). Proceedings of the Third International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Smith, S.F. (1980). A learning system based on genetic adaptive algorithms. Unpublished Ph.D. dissertation, Computer Science Department, University of Pittsburgh, Pittsburgh, PA.
Tanese, R. (1989a). Distributed genetic algorithms for function optimization. Unpublished doctoral dissertation, University of Michigan, Ann Arbor, MI.
Tanese, R. (1989b). Distributed genetic algorithms. In J.D. Schaffer (Ed.), Proceedings of the Third International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Walsh, J.L. (1923). A closed set of normal orthogonal functions. American Journal of Mathematics, 45, 5–24.
Whitley, L.D. (1991). Fundamental principles of deception in genetic search. In G. Rawlins (Ed.), Foundations of genetic algorithms. San Mateo, CA: Morgan Kaufmann.
Whitley, L.D., Dominic, S., & Das, R. (1991). Genetic reinforcement learning with multilayer neural networks. In R.K. Belew & L.B. Booker (Eds.), Proceedings of the Fourth International Conference on Genetic Algorithms (pp. 562–569). San Mateo, CA: Morgan Kaufmann.
Wilson, S.W. (1991). GA-easy does not imply steepest-ascent optimizable. In R.K. Belew & L.B. Booker (Eds.), Proceedings of The Fourth International Conference on Genetic Algorithms. San Mateo, CA: Morgan Kaufmann.
Cite this article
Forrest, S., Mitchell, M. What Makes a Problem Hard for a Genetic Algorithm? Some Anomalous Results and Their Explanation. Machine Learning 13, 285–319 (1993). https://doi.org/10.1023/A:1022626114466