A new optimization method: Big Bang–Big Crunch

https://doi.org/10.1016/j.advengsoft.2005.04.005

Abstract

Nature is the principal source of inspiration for new optimization methods such as genetic algorithms (GA) and simulated annealing (SA). All traditional evolutionary algorithms are heuristic population-based search procedures that incorporate random variation and selection. The main contribution of this study is a novel optimization method that relies on one of the theories of the evolution of the universe, namely the Big Bang and Big Crunch theory. In the Big Bang phase, energy dissipation produces disorder, and randomness is the main feature of this phase; in the Big Crunch phase, randomly distributed particles are drawn back into an order. Inspired by this theory, an optimization algorithm is constructed, called the Big Bang–Big Crunch (BB–BC) method, which generates random points in the Big Bang phase and shrinks those points to a single representative point, via a center-of-mass or minimal-cost approach, in the Big Crunch phase. The new BB–BC method is shown to outperform an improved and enhanced genetic search algorithm also developed by the authors of this study, as well as the classical genetic algorithm (GA), on many benchmark test functions.

Introduction

The optimization problem can be summarized as finding the set of parameters {xn} that minimizes an objective function f(x0,…,xn), also referred to as the fitness function in evolutionary algorithms. Global optimization requires arriving at the best possible decision in any given set of circumstances. Genetic algorithms (GA) [1], [2], [3], [4] are a class of stochastic global optimization methods that model the basic principles and mechanisms of Mendelian genetics, evolutionary theory and population genetics to tackle hard search and optimization problems. The main difference between the GA and other classical search algorithms is that the GA operates on a ‘population’ of strings (chromosomes in genetic language) instead of a single input parameter. This gives the GA the ability to search globally for the optimum parameters. However, the GA suffers from premature convergence, slow convergence and long execution times. To overcome these difficulties, many modifications to the GA and its associated operators, such as roulette-wheel selection [1], byte-wise crossover [5], combined mutation operators [6], or Cauchy mutations [7], have been proposed.

Classical GA, when coupled with a highly elitist selection property, has the potential to gather the entire population around the fittest member; that is, the GA may fail to converge to the global optimum point depending upon the selective capacity of the fitness function (a large variation of the fitness function for a small variation of its input variables). In other words, a fitness function is called elitist if it exaggerates the small differences among the fitness values of the chromosomes. For instance, a hyperbolic function is more elitist than a linear function. The global convergence property of the algorithm is then assured only through the mutation operator, which creates new chromosomes at the expense of extra computational effort. Moreover, if the response surface is quite smooth, the GA requires more response (fitness) function evaluations than classical hill-climbing techniques. Some ‘more intelligent’ rules and/or hybrid techniques such as evolutionary-gradient search (EGS) have been added to the GA to overcome this slow convergence phenomenon near the optimum solution [8], [9], [10], [11].
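The elitism contrast above can be made concrete with a small sketch. This is an illustration, not taken from the paper: the linear offset (2.0) and hyperbolic shift (0.9) are arbitrary constants chosen for the demonstration.

```python
# Illustrative sketch (not from the paper): how a hyperbolic fitness
# scaling exaggerates a small cost difference compared with a linear one,
# making roulette-wheel selection more "elitist".

def selection_probs(fitnesses):
    """Roulette-wheel selection probabilities, proportional to fitness."""
    total = sum(fitnesses)
    return [f / total for f in fitnesses]

costs = [1.00, 1.05]  # two chromosomes with nearly identical costs

linear = selection_probs([2.0 - c for c in costs])              # linear scaling
hyperbolic = selection_probs([1.0 / (c - 0.9) for c in costs])  # hyperbolic scaling

# The hyperbolic scaling hands the marginally better chromosome a much
# larger share of the roulette wheel than the linear scaling does, which
# is exactly the aggressive convergence the C-GA tries to damp.
```

Under the linear scaling the two selection probabilities stay close to 50/50, while the hyperbolic scaling pushes them apart sharply, so the slightly fitter chromosome quickly dominates the population.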

A priori knowledge may be added in the initialization stage as well. The straightforward approach is to sample uniformly at random from the state space of possible solutions, thereby providing an unbiased initial population. This is often useful when benchmarking evolutionary algorithms, but when solving real problems, some information is usually available that can seed the initial population with useful hints. Biasing the initial population towards better solutions can often save a great deal of time that would otherwise be spent ‘reinventing the wheel’ [12]. However, all of these precautions inflate the algorithms with extra calculations.

In this study, the difference between the classical genetic algorithm and the combat genetic algorithm is first briefly explained, and the improved and enhanced genetic search algorithm developed by the authors of this study and named the ‘combat genetic algorithm’ (C-GA) [13] is reviewed in Section 2. The Big Bang–Big Crunch (BB–BC) method, which is the main contribution of this study, is presented in Section 3. Next, the results of the Big Bang–Big Crunch method are compared with those of the combat genetic algorithm (C-GA) in Section 4. Finally, the overall comparison results and the advantages of the proposed method are discussed in Section 5.

Section snippets

Genetic operators and combat genetic algorithm

The C-GA differs from classical GA in the respect that the diversity is kept high by reducing the aggressive convergence tendency of the reproduction operator. Although the selection or reproduction operators in GA are the principal elements that shift the chromosomes towards a local/global optimum point, due to the decreasing diversity these can be viewed as a source of premature convergence to a local optimum point. In both GA and C-GA, the increase of meaningful information is assured by

Big Bang–Big Crunch (BB–BC) method

Randomness can be seen as equivalent to the energy dissipation in nature while convergence to a local or global optimum point can be viewed as gravitational attraction. Since energy dissipation creates disorder from ordered particles, we will use randomness as a transformation from a converged solution (order) to the birth of totally new solution candidates (disorder or chaos).

The proposed method is similar to the GA in respect to creating an initial population randomly. The creation of the
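The two-phase loop described above and in the abstract (a random Big Bang, then a Big Crunch contracting the population to a representative point) can be sketched as follows. This is an illustrative reconstruction, not the authors' exact formulation: the inverse-cost weighting of the center of mass and the 1/k shrinking of the scatter radius are assumptions made for the sketch.

```python
import random

def bb_bc(cost, dim, n_pop=50, n_iter=100, limit=10.0, seed=0):
    """Minimal Big Bang-Big Crunch sketch (illustrative assumptions).

    cost  : objective function to minimize
    Big Bang: scatter random candidates; Big Crunch: contract them to a
    center of mass weighted by inverse cost, then re-scatter around it
    with a spread that shrinks as the iteration count k grows.
    """
    rng = random.Random(seed)
    # Initial Big Bang: uniform random candidates in [-limit, limit]^dim.
    pop = [[rng.uniform(-limit, limit) for _ in range(dim)]
           for _ in range(n_pop)]
    best, best_cost = None, float("inf")
    for k in range(1, n_iter + 1):
        costs = [cost(x) for x in pop]
        for x, c in zip(pop, costs):
            if c < best_cost:
                best, best_cost = list(x), c
        # Big Crunch: center of mass, each candidate weighted by 1/cost
        # so that cheap (fit) candidates pull the center towards them.
        weights = [1.0 / (c + 1e-12) for c in costs]
        wsum = sum(weights)
        center = [sum(w * x[d] for w, x in zip(weights, pop)) / wsum
                  for d in range(dim)]
        # Next Big Bang: normal scatter around the center; the spread
        # decays as limit / k, turning disorder back into order.
        pop = [[center[d] + rng.gauss(0.0, 1.0) * limit / k
                for d in range(dim)] for _ in range(n_pop)]
    return best, best_cost
```

On the sphere function in two dimensions, this sketch drives the best cost close to zero within a hundred iterations, since the inverse-cost weighting snaps the center of mass onto near-optimal candidates while the shrinking scatter refines around it.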

Benchmark test results

The new algorithm BB–BC has been tested and compared with the C-GA on the benchmark problems taken from [11]. This list comprises some widely used test functions such as sphere, Rosenbrock, step, ellipsoid, Rastrigin and Ackley functions given in Table 1.

In these experiments, the normalized performance measure given by log(f*) = log(f(x0)/f(x*)) is used. Here, x0 is chosen as [10, 10, 10,…, 10] to be a reference point. Each of the ameliorations is calculated for 30 independent trials and
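The performance measure can be sketched directly; using a base-10 logarithm is an assumption here (the text only says log), and the sphere function stands in for the benchmark functions of Table 1.

```python
import math

def normalized_performance(f, x0, x_star):
    """log(f*) = log(f(x0) / f(x*)): improvement of the found solution
    x_star over the fixed reference point x0 = [10, 10, ..., 10],
    expressed in orders of magnitude (base-10 log assumed)."""
    return math.log10(f(x0) / f(x_star))

def sphere(x):
    """Sphere benchmark: sum of squared coordinates, optimum at 0."""
    return sum(v * v for v in x)

# Example: in 5 dimensions the reference cost is sphere([10]*5) = 500;
# a solution at [0.1, ..., 0.1] has cost 0.05, a 10^4-fold improvement,
# so the measure evaluates to 4.
```

Larger values of the measure thus mean a better amelioration relative to the common starting point, which makes the 30-trial averages comparable across the test functions.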

Conclusions

When searching for the global optimum of complex problems, particularly in problems with many local optima, the main conflict is between ‘accuracy’, ‘reliability’ and ‘computation time’. In the case that traditional optimization methods fail to provide efficiently reliable results, evolutionary algorithms may constitute an interesting alternative. Nevertheless, classical GA or its variant combat approach may still suffer from excessively slow convergence; that is, they are generally sluggish in

References (15)

  • O. Tigli et al.

    A global optimization in controlling a plant

    Electr Power Syst Res

    (1994)
  • D.E. Goldberg

    Genetic algorithms in search, optimization and machine learning

    (1989)
  • J.H. Holland

    Adaptation in natural and artificial systems: an introductory analysis with applications to biology, control, and artificial intelligence

    (1992)
  • L. Davis (Ed.)

    Handbook of genetic algorithms

    (1991)
  • T. Bäck et al.

    Evolutionary computation: comments on the history and current state

    IEEE Trans Evol Comput

    (1997)
  • Y. Davidor

    Genetic algorithms and robotics

    World Sci Ser Rob Autom Syst

    (1991)
  • K. Chellapilla

    Combining mutation operators in evolutionary programming

    IEEE Trans Evol Comput

    (1998)