
1992 | Book

Genetic Algorithms + Data Structures = Evolution Programs

Author: Zbigniew Michalewicz

Publisher: Springer Berlin Heidelberg

Book series: Artificial Intelligence


About this book

'What does your Master teach?' asked a visitor. 'Nothing,' said the disciple. 'Then why does he give discourses?' 'He only points the way - he teaches nothing.'
Anthony de Mello, One Minute Wisdom

During the last three decades there has been a growing interest in algorithms which rely on analogies to natural processes. The emergence of massively parallel computers made these algorithms of practical interest. The best known algorithms in this class include evolutionary programming, genetic algorithms, evolution strategies, simulated annealing, classifier systems, and neural networks. Recently (1-3 October 1990) the University of Dortmund, Germany, hosted the First Workshop on Parallel Problem Solving from Nature [164].

This book discusses a subclass of these algorithms - those which are based on the principle of evolution (survival of the fittest). In such algorithms a population of individuals (potential solutions) undergoes a sequence of unary (mutation type) and higher order (crossover type) transformations. These individuals strive for survival: a selection scheme, biased towards fitter individuals, selects the next generation. After some number of generations, the program converges - the best individual hopefully represents the optimum solution. There are many different algorithms in this category. To underline the similarities between them we use the common term "evolution programs".

Table of Contents

Frontmatter

Introduction

Introduction
Abstract
During the last thirty years there has been a growing interest in problem solving systems based on principles of evolution and heredity: such systems maintain a population of potential solutions, they have some selection process based on fitness of individuals, and some recombination operators. One type of such system is the class of Evolution Strategies, i.e., algorithms which imitate the principles of natural evolution for parameter optimization problems [149], [162] (Rechenberg, Schwefel). Fogel’s Evolutionary Programming [57] is a technique for searching through a space of small finite-state machines. Glover’s Scatter Search techniques [64] maintain a population of reference points and generate offspring by weighted linear combinations. Another type of evolution-based system is Holland’s Genetic Algorithms (GAs) [89]. In 1990, Koza [108] proposed an evolution based system to search for the most fit computer program to solve a particular problem.
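All the systems surveyed in this abstract share one structure: a population of candidate solutions, fitness-biased selection, and recombination/mutation operators. The sketch below is an illustrative skeleton of that loop, not code from the book; every name and parameter value is an assumption chosen for the toy instance.

```python
import random

def evolution_program(init, fitness, mutate, crossover,
                      pop_size=30, generations=60, seed=1):
    """Skeleton shared by evolution-based systems: maintain a population,
    select parents with a fitness bias, apply recombination and mutation.
    All names and parameter values here are illustrative."""
    rng = random.Random(seed)
    population = [init(rng) for _ in range(pop_size)]
    for _ in range(generations):
        # Fitness-proportional ("roulette wheel") selection; weights must be positive.
        weights = [fitness(ind) for ind in population]
        parents = rng.choices(population, weights=weights, k=pop_size)
        # Higher-order (crossover) followed by unary (mutation) transformations.
        population = [
            mutate(crossover(parents[i], parents[(i + 1) % pop_size], rng), rng)
            for i in range(pop_size)
        ]
    return max(population, key=fitness)

def one_point(a, b, rng):
    """One-point crossover on equal-length bit lists."""
    p = rng.randrange(1, len(a))
    return a[:p] + b[p:]

# Toy instance: maximize the number of ones in a 16-bit string.
best = evolution_program(
    init=lambda rng: [rng.randint(0, 1) for _ in range(16)],
    fitness=lambda ind: sum(ind) + 1,                  # +1 keeps weights positive
    mutate=lambda ind, rng: [b ^ (rng.random() < 0.05) for b in ind],
    crossover=one_point,
)
```

The book's individual systems differ precisely in how `init`, `mutate`, and `crossover` are instantiated for a given data structure.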
Zbigniew Michalewicz

Genetic Algorithms

Frontmatter
1. GAs: What Are They?
Abstract
There is a large class of interesting problems for which no reasonably fast algorithms have been developed. Many of these problems are optimization problems that arise frequently in applications. Given such a hard optimization problem, it is often possible to find an efficient algorithm whose solution is approximately optimal. For some hard optimization problems we can use probabilistic algorithms as well — these algorithms do not guarantee the optimum value, but by randomly choosing sufficiently many “witnesses” the probability of error may be made as small as we like.
Zbigniew Michalewicz
2. GAs: How Do They Work?
Abstract
In this chapter we discuss the actions of a genetic algorithm for a simple parameter optimization problem. We start with a few general comments; a detailed example follows.
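The chapter's detailed example walks a classical binary-coded GA through a one-variable maximization. The sketch below is in that spirit but is not the book's code: the multimodal test function, chromosome length, and parameter values are illustrative assumptions.

```python
import math
import random

def decode(bits, lo=-1.0, hi=2.0):
    """Map a binary chromosome to a real x in [lo, hi]."""
    n = int("".join(map(str, bits)), 2)
    return lo + n * (hi - lo) / (2 ** len(bits) - 1)

def f(x):
    # A multimodal one-variable function of the kind used for such demonstrations.
    return x * math.sin(10 * math.pi * x) + 1.0

def run_ga(bits=22, pop_size=50, generations=100, pc=0.7, pm=0.01, seed=7):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fit = [f(decode(c)) + 2.0 for c in pop]       # shift keeps weights positive
        mating = rng.choices(pop, weights=fit, k=pop_size)
        nxt = []
        for i in range(0, pop_size, 2):
            a, b = mating[i][:], mating[i + 1][:]
            if rng.random() < pc:                     # one-point crossover
                p = rng.randrange(1, bits)
                a, b = a[:p] + b[p:], b[:p] + a[p:]
            nxt += [a, b]
        # Bit-flip mutation with probability pm per gene.
        pop = [[g ^ (rng.random() < pm) for g in c] for c in nxt]
    return max((decode(c) for c in pop), key=f)

x_best = run_ga()
```

The three ingredients demonstrated in the chapter are visible here: an encoding (`decode`), fitness-proportional selection, and the two genetic operators.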
Zbigniew Michalewicz
3. GAs: Why Do They Work?
Abstract
The theoretical foundations of genetic algorithms rely on a binary string representation of solutions, and on the notion of a schema (see e.g., [89]) — a template allowing exploration of similarities among chromosomes. A schema is built by introducing a don’t care symbol (*) into the alphabet of genes. A schema represents all strings (a hyperplane, or subset of the search space) which match it on all positions other than ‘*’.
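The schema notion is easy to make concrete in code. A minimal sketch: a matching test for the don't-care symbol, plus the two standard quantities of schema analysis, the order (number of fixed positions) and the defining length (span between the outermost fixed positions).

```python
def matches(schema, string):
    """True if the bit string is an instance of the schema;
    '*' is the don't-care symbol."""
    return len(schema) == len(string) and all(
        s == '*' or s == c for s, c in zip(schema, string))

def order(schema):
    """Number of fixed (non-'*') positions."""
    return sum(s != '*' for s in schema)

def defining_length(schema):
    """Distance between the outermost fixed positions (0 if fewer than two)."""
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if len(fixed) > 1 else 0

# The schema 1**0 represents the four strings 1000, 1010, 1100, 1110.
hit = matches("1**0", "1010")   # True
```

Order and defining length govern how likely mutation and crossover are to disrupt a schema, which is the core of the chapter's argument.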
Zbigniew Michalewicz
4. GAs: Selected Topics
Abstract
GA theory provides some explanation of why, for a given problem formulation, we may obtain convergence to the sought optimal point. Unfortunately, practical applications do not always follow the theory, with the main reasons being:
  • the coding of the problem often moves the GA to operate in a different space than that of the problem itself,
  • there is a limit on the hypothetically unlimited number of iterations, and
  • there is a limit on the hypothetically unlimited population size.
Zbigniew Michalewicz

Numerical Optimization

Frontmatter
5. Binary or Float?
Abstract
As discussed in the previous chapter, GA applications encounter some problems that may delay, if not prevent, finding the optimal solution with the desired precision. One of the consequences of these problems was premature convergence of the entire population to a non-global optimum (Chapter 4); others include inability to perform fine local tuning and inability to operate in the presence of nontrivial constraints (Chapters 6 and 7).
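Two facts frame the binary-versus-float question, and both are easy to compute. An m-bit encoding of an interval [lo, hi] can resolve steps no finer than (hi − lo)/(2^m − 1), and adjacent real values can sit far apart in Hamming distance (a "Hamming cliff"), which makes small real-valued moves expensive for bit-flip mutation. The snippet below is an illustration of these two points, with the 22-bit/[−1, 2] figures chosen as an example.

```python
def precision(lo, hi, m):
    """Finest step representable by an m-bit binary encoding of [lo, hi]."""
    return (hi - lo) / (2 ** m - 1)

def hamming(a, b):
    """Hamming distance between the binary codes of integers a and b."""
    return bin(a ^ b).count("1")

# 22 bits over [-1, 2] resolve steps of roughly 7.2e-7 ...
step = precision(-1.0, 2.0, 22)

# ... yet the neighbouring codes 0111 and 1000 differ in every bit:
cliff = hamming(0b0111, 0b1000)   # 4
```

A floating-point chromosome avoids the cliff entirely, at the cost of leaving the binary schema analysis of Chapter 3 behind.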
Zbigniew Michalewicz
6. Fine Local Tuning
Abstract
Genetic algorithms display inherent difficulties in performing local search for numerical applications. Holland suggested [89] that the genetic algorithm should be used as a preprocessor to perform the initial search, before turning the search process over to a system that can employ domain knowledge to guide the local search. As observed in [81]:
“Like natural genetic systems, GAs progress by virtue of changing the distribution of high performance substructures in the overall population; individual structures are not the focus of attention. Once the high performance regions of the search space are identified by a GA, it may be useful to invoke a local search routine to optimize the members of the final population.”
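The hybrid scheme described in the quotation — run the GA first, then hand its output to a local optimizer — can be sketched in a few lines. The hill-climber below is an illustrative stand-in for the "local search routine"; the step-halving scheme and the test function are my assumptions, not the book's.

```python
def hill_climb(x, f, step=0.1, shrink=0.5, iters=100):
    """Simple local search for one real parameter: try x +/- step,
    halve the step when neither direction improves f."""
    for _ in range(iters):
        if f(x + step) > f(x):
            x += step
        elif f(x - step) > f(x):
            x -= step
        else:
            step *= shrink
    return x

# Polish a (hypothetical) GA result near a smooth peak at x = 1.234:
f = lambda x: -(x - 1.234) ** 2
refined = hill_climb(0.9, f)
```

The GA locates the high-performance region; the climber supplies the fine local tuning the GA itself struggles with.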
Zbigniew Michalewicz
7. Handling Constraints
Abstract
The central problem in applications of genetic algorithms is that of constraints; few approaches to the constraint problem in genetic algorithms have previously been proposed. One of these uses penalty functions as an adjustment to the optimized objective function; other approaches use “decoders” or “repair” algorithms, which avoid building an illegal individual or repair one, respectively. However, these approaches suffer from the disadvantage of being tailored to the specific problem and are not sufficiently general to handle a variety of problems.
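The first approach mentioned, penalty functions, amounts to folding constraint violation into the fitness itself. A minimal sketch, assuming maximization and constraints written as g(x) <= 0 (the weight and the toy problem are illustrative choices, not the book's):

```python
def penalized(objective, constraints, weight=1000.0):
    """Penalty-function approach: subtract a weighted measure of constraint
    violation from the objective. Each constraint is a function g with
    g(x) <= 0 when satisfied; positive g(x) is the amount of violation."""
    def fitness(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return objective(x) - weight * violation
    return fitness

# Maximize x * y subject to x + y <= 10:
fit = penalized(lambda p: p[0] * p[1],
                [lambda p: p[0] + p[1] - 10.0])
```

A feasible point such as (5, 5) keeps its objective value of 25, while an infeasible (6, 6) is penalized far below it; choosing the weight well is the delicate, problem-specific part the abstract complains about.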
Zbigniew Michalewicz
8. Evolution Strategies and Other Methods
Abstract
Evolution strategies (ESs) are algorithms which imitate the principles of natural evolution as a method to solve parameter optimization problems [7], [162]. They were developed in Germany during the 1960s. As stated in [162]:
“In 1963 two students at the Technical University of Berlin met and were soon collaborating on experiments which used the wind tunnel of the Institute of Flow Engineering. During the search for the optimal shapes of bodies in a flow, which was then a matter of laborious intuitive experimentation, the idea was conceived of proceeding strategically. However, attempts with the coordinate and simple gradient strategies were unsuccessful. Then one of the students, Ingo Rechenberg, now Professor of Bionics and Evolutionary Engineering, hit upon the idea of trying random changes in the parameters defining the shape, following the example of natural mutations. The evolution strategy was born.”
(The second student was Hans-Paul Schwefel, now Professor of Computer Science and Chair of System Analysis).
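The simplest member of this family, the (1+1)-ES, captures Rechenberg's original idea directly: perturb the real-valued parameters with random (Gaussian) changes and keep the mutant only if it is no worse. The sketch below omits step-size adaptation (e.g., the 1/5 success rule) for brevity; the fixed sigma and the test function are illustrative choices.

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=200, seed=3):
    """(1+1) evolution strategy: one parent, one Gaussian-mutated
    offspring per generation, keep the better of the two (maximizing f)."""
    rng = random.Random(seed)
    x = list(x0)
    for _ in range(iters):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        if f(y) >= f(x):
            x = y
    return x

# Minimize the sphere function by maximizing its negative:
sphere = lambda v: -sum(vi * vi for vi in v)
x = one_plus_one_es(sphere, [5.0, -3.0])
```

Later ES variants in the chapter enlarge the population and evolve the mutation parameters themselves.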
Zbigniew Michalewicz

Evolution Programs

Frontmatter
9. The Transportation Problem
Abstract
In Chapter 7 we compared different GA approaches for handling constraints using an example of the transportation problem. It seems that for this particular class of problems we can do better: we can use a more appropriate (natural) data structure (for a transportation problem, a matrix) and specialized “genetic” operators which operate on matrices. Such an evolution program would be a much stronger method than GENOCOP: GENOCOP optimizes any function with linear constraints, whereas the new evolution program optimizes only transportation problems (these problems have precisely n + k − 1 independent equalities, where n and k denote the number of sources and destinations, respectively; see the description of the transportation problem below). However, it would be very interesting to see what we can gain by introducing extra problem-specific knowledge into an evolution program.
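The natural data structure here is a matrix whose row sums equal the supplies and whose column sums equal the demands. One classical way to construct a feasible starting matrix is the northwest-corner rule, sketched below as an illustration; the book's own initialization procedure may differ.

```python
def northwest_corner(supply, demand):
    """Build a feasible transportation matrix: row i sums to supply[i],
    column j sums to demand[j] (total supply must equal total demand).
    Classical northwest-corner rule."""
    assert abs(sum(supply) - sum(demand)) < 1e-9
    s, d = list(supply), list(demand)
    m = [[0.0] * len(d) for _ in s]
    i = j = 0
    while i < len(s) and j < len(d):
        q = min(s[i], d[j])        # ship as much as this cell allows
        m[i][j] = q
        s[i] -= q
        d[j] -= q
        if s[i] == 0:              # row exhausted: move down
            i += 1
        else:                      # column exhausted: move right
            j += 1
    return m

plan = northwest_corner([15.0, 25.0, 5.0], [5.0, 15.0, 15.0, 10.0])
```

Specialized "genetic" operators can then recombine and mutate such matrices while preserving the row and column sums, which is exactly the problem-specific knowledge the abstract advocates.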
Zbigniew Michalewicz
10. The Traveling Salesman Problem
Abstract
In the next chapter, we present several examples of evolution programs tailored to specific applications (graph drawing, minimum spanning tree, scheduling). The traveling salesman problem (TSP) is just one such application; however, we treat it as a special problem — the mother of all problems — and discuss it in a separate chapter. What are the reasons?
Zbigniew Michalewicz
11. Drawing Graphs, Scheduling, and Partitioning
Abstract
As stated in the Introduction, it seems that most researchers “modified” their implementations of genetic algorithms either by using non-standard chromosome representation or by designing problem-specific genetic operators (e.g., [63], [185], [29], [36], etc.) to accommodate the problem to be solved, thus building efficient evolution programs. The first two modifications were discussed in detail in the previous two chapters (Chapters 9 and 10) for the transportation problem and the traveling salesman problem, respectively. In this chapter, we have made a somewhat arbitrary selection of a few other evolution programs developed by the author and other researchers, which are based on non-standard chromosome representation and/or problem-specific knowledge operators. We present systems for the graph drawing problem (Section 11.1), scheduling problems (Section 11.2), the timetable problem (Section 11.3), and partitioning problems (Section 11.4). The described systems and the results of their applications provide an additional argument to support the evolution programming approach, which promotes creation of data structures together with operators for a particular class of problems.
Zbigniew Michalewicz
12. Machine Learning
Abstract
Machine learning is primarily devoted to building computer programs able to construct new knowledge or to improve already possessed knowledge by using input information; much of this research employs heuristic approaches to learning rather than algorithmic ones. The most active research area in recent years [140] has continued to be symbolic empirical learning (SEL). This area is concerned with creating and/or modifying general symbolic descriptions, whose structure is unknown a priori. The most common topic in SEL is developing concept descriptions from concept examples [113], [140]. In particular, the problems in attribute-based spaces are of practical importance: in many such domains it is relatively easy to come up with a set of example events; on the other hand, it is quite difficult to formulate hypotheses. The goal of a system implementing this kind of supervised learning is:
Given the initial set of example events and their membership in concepts, produce classification rules for the concepts present in the input set.
Zbigniew Michalewicz
Conclusions
Abstract
In this book we discussed different strategies, called Evolution Programs, which might be applied to hard optimization problems and which were based on the principle of evolution. Evolution programs borrow heavily from genetic algorithms. However, they incorporate problem-specific knowledge by using “natural” data structures and problem-sensitive “genetic” operators. The basic difference between GAs and EPs is that the former are classified as weak, problem-independent methods, which is not the case for the latter.
Zbigniew Michalewicz
Backmatter
Metadata
Title
Genetic Algorithms + Data Structures = Evolution Programs
Author
Zbigniew Michalewicz
Copyright year
1992
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-662-02830-8
Print ISBN
978-3-662-02832-2
DOI
https://doi.org/10.1007/978-3-662-02830-8