Engineering Applications of Artificial Intelligence
Survey Paper: A survey on applications of the harmony search algorithm
Introduction
The application of optimization algorithms to real-world problems has gained momentum in the last decade. Dating back to the early 1940s, diverse traditional mathematical methods such as linear programming (LP), nonlinear programming (NLP) or dynamic programming (DP) were first employed for solving complex optimization problems by resorting to different relaxation methods of the underlying formulation. These techniques are capable of cost-efficiently obtaining a global optimal solution in problem models subject to certain particularities (e.g. optimal substructure and subproblem overlap for dynamic programming), but unfortunately their application range does not cover the whole class of NP-complete problems, where an exact solution cannot be found in polynomial time. In fact, the solution space (and hence, the solving time) of the problem increases exponentially with the number of inputs, which makes these techniques infeasible for practical applications.
In order to cope with the above shortcoming, meta-heuristic techniques—conceived as intelligent self-learning algorithms stemming from the study and mimicking of intelligent processes and behaviors arising in nature, sociology and other disciplines (see Fig. 1 for a general overview)—have emerged for efficiently tackling this class of hard optimization problems. In many cases (especially when large-scale problems are addressed), meta-heuristics are deemed one of the most efficient alternatives to seek and determine a near-optimal solution without relying on exact yet computationally demanding algorithms, and by overcoming the major drawback of local search algorithms, i.e. getting trapped in biased local regions far from the sought global solution. Indeed, on the latter lies one of the main design challenges of modern meta-heuristic methods: to avoid local optima so as to reach global optima, which can be achieved by exploring the whole search space through the use of intelligent stochastically driven operators. To this end, it is of utmost importance for enhancing the overall search performance of the meta-heuristic to balance the trade-off between two important related aspects: diversification and intensification. On the one hand, diversification refers to a form of randomization added to the deterministic components driving the algorithm operation in order to explore the search space in a diverse manner, while intensification stands for the improvement of existing partial solutions to the problem by exploring the vicinity of such solutions in the solution space (e.g. by means of elitism mechanisms). If diversification is too strong, a great number of zones of the search space may be only loosely explored, which will reduce the convergence rate of the algorithm.
By contrast, if diversification is kept low in the algorithm design, there is a significant risk of leaving a fraction of the solution space unexplored or even producing far-from-optimal solutions due to trapping in local optima. An appropriate intensification is carried out by means of exploiting the knowledge acquired from past solutions, and by properly fine-tuning the parameters that control the performance of the algorithm for a particular problem so as to enhance its convergence properties.
Based on the above rationale, the last four decades have witnessed the advent and rise of new innovative meta-heuristic algorithms, grouped into several branches based on their working methodology, and applied to a wide variety of different optimization problems such as resource allocation, industrial planning, scheduling, decision making, medical, engineering and computer science applications, among many others. Many of them are actually inspired by natural phenomena and have been developed by mimicking the intelligence characteristics of biological and physical agents, such as Genetic Algorithms, Cuckoo Search, Ant Colony Optimization, Simulated Annealing, Particle Swarm Optimization or Harmony Search. The main difference among existing meta-heuristics concerns (1) the way diversification and intensification are mutually balanced in the algorithm thread; (2) the search path they follow; and/or (3) the observed phenomenon that inspires the algorithm design. Another characteristic of meta-heuristic algorithms is the distinction between memory-based versus memoryless methods, as well as the definition of population-based and single-point search-based strategies. For instance, in Ant Colony Optimization a memory of previously obtained solutions is kept in the pheromone trail matrix in order to construct new refined solutions. Likewise, the population in Genetic Algorithms can be regarded as a memory register of recently encountered partial solutions. On the contrary, Simulated Annealing exemplifies the case of a memoryless algorithm in which no record of past solutions is employed.
In contrast to single-point search-based algorithms, in which a unique solution is generated at each iteration, population-based meta-heuristic algorithms maintain a set of solutions (i.e. a population) which evolves at each iteration. Therefore, population-based algorithms provide an efficient and natural way for exploring the search space and obtaining an acceptable solution. Interestingly, Memetic Algorithms hybridize these two strategies in order to balance the diversification and intensification of their search procedure. Most single-point search algorithms, such as Simulated Annealing and Tabu Search, are based on a single neighborhood structure which defines the allowed movements through the search space. However, Iterated Local Search algorithms normally define two different neighborhood structures among which the algorithm shifts when reaching a local optimum.
This paper focuses on the principles of Harmony Search, its comparative analysis with respect to other meta-heuristics, and a detailed report of recent applications and associated developments attained during the last few years. This review allows the interested reader not only to get a clear insight into the design and working principles of this algorithm, but also to identify technical advances and application trends of Harmony Search in diverse fields. To fulfill these goals, the manuscript is structured as follows: once the structure and steps of the original algorithm have been posed in Section 1.1, its intrinsic characteristics are analyzed and compared to other meta-heuristics in Section 1.2. Then, an overview of its application areas and an outline of current trends in such fields is tackled in Sections 2 and 3, by classifying the application cases in the related literature by discipline area, and by highlighting the modifications and improvements of each particular field. Finally, Section 4 draws some concluding remarks and outlines several future research lines of interest.
Harmony Search (hereafter HS) is a relatively new population-based meta-heuristic algorithm which has obtained excellent results in the field of combinatorial optimization (Geem, 2000). It mimics the behavior of a music orchestra when aiming at composing the most harmonious melody, as measured by aesthetic standards. When comparing the improvisation process of musicians with the optimization task, we can realize that each musician corresponds to a decision variable; the musical instrument's pitch range refers to the alphabet of the decision variable; the musical harmony improvised at a certain time corresponds to a solution vector at a given iteration; and the audience's aesthetic impression links to the objective (fitness) function of the optimization problem at hand. Just like musicians improve the melody time after time, the HS algorithm progressively enhances the fitness of the solution vector in an iterative fashion.
As previously mentioned, HS is a population-based algorithm; it hence maintains a set of solutions in the so-called Harmony Memory (HM). An estimate of the optimal solution is produced at every iteration by applying a set of optimization parameters to the HM, which generates a new harmony vector every time. Fig. 2 illustrates the flow diagram of the HS algorithm, which can be summarized in four steps: (i) initialization of the HM; (ii) improvisation of a new harmony; (iii) inclusion of the newly generated harmony in the HM provided that its fitness improves the worst fitness value in the previous HM; and (iv) returning to step (ii) until a termination criterion (e.g. maximum number of iterations or fitness stall) is satisfied.
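The four steps above can be sketched in a few lines of Python. This is a minimal illustration for a continuous minimization problem; the function name and the parameter defaults (memory size, HMCR, PAR, bandwidth) are assumptions for the sketch, not values prescribed by the survey:

```python
import random

def harmony_search(fitness, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, max_iters=10000):
    """Minimal HS sketch: minimize `fitness` over box constraints `bounds`."""
    # Step (i): initialize the Harmony Memory with random harmonies
    hm = [[random.uniform(lo, hi) for (lo, hi) in bounds] for _ in range(hms)]
    scores = [fitness(h) for h in hm]

    for _ in range(max_iters):
        # Step (ii): improvise a new harmony, note by note
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:                  # memory consideration
                x = random.choice(hm)[j]
                if random.random() < par:               # pitch adjustment
                    x = min(hi, max(lo, x + bw * random.uniform(-1, 1)))
            else:                                       # random consideration
                x = random.uniform(lo, hi)
            new.append(x)

        # Step (iii): replace the worst harmony if the new one improves on it
        worst = max(range(hms), key=lambda i: scores[i])
        f_new = fitness(new)
        if f_new < scores[worst]:
            hm[worst], scores[worst] = new, f_new
        # Step (iv): loop until the termination criterion is met

    best = min(range(hms), key=lambda i: scores[i])
    return hm[best], scores[best]
```

For example, `harmony_search(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])` converges towards the origin, the global minimum of the sphere function.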
The above improvisation procedure is mainly controlled by two different probabilistic operators, which are sequentially applied to each note so as to produce a new set of improvised harmonies or candidate solutions:
- The Harmony Memory Considering Rate, HMCR, establishes the probability that the new value for a certain note is drawn uniformly from the values of this same note in all the remaining melodies. Otherwise (i.e. with probability 1 − HMCR), the note value is randomly chosen from its alphabet. This case is commonly referred to as random consideration, as it increases the diversity of the solutions towards global optimality; however, some works (e.g. Gil-Lopez et al., 2011) implement the random consideration as a third, separate probabilistic operator.
- The Pitch Adjusting Rate, PAR, establishes the probability that the new value x_new for a given note is obtained by adding a small random amount to the existing value x_old. For continuous alphabets, this operation reduces to x_new = x_old + bw · ε, where bw represents the pitch bandwidth and ε is a random number drawn from a uniform distribution with support [−1, 1]. A low pitch adjusting rate with a narrow bandwidth may restrict the diversification of the algorithm to a small search subspace and, consequently, decrease the convergence rate of the overall solver. On the other hand, a high pitch adjusting rate with a high value of bw may force the algorithm to unnecessarily escape from areas with potentially near-optimal solutions. When dealing with discrete alphabets, a similar definition holds for this PAR operator, the difference being that the underlying alphabet must be sorted under some vicinity ordering criterion, which in general depends on the characteristics of the metric defining the problem at hand.
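The discrete case mentioned above can be made concrete with a short sketch of improvising a single note over a vicinity-ordered alphabet, where pitch adjustment moves to an adjacent symbol instead of adding a continuous perturbation. The function name and parameter defaults are illustrative assumptions:

```python
import random

def improvise_note(memory_values, alphabet, hmcr=0.9, par=0.3):
    """Improvise one note over a discrete, vicinity-ordered alphabet.

    memory_values: the values this note takes across the harmonies in the HM.
    alphabet: list sorted so that adjacent entries are neighbouring pitches.
    """
    if random.random() < hmcr:
        value = random.choice(memory_values)       # memory consideration
        if random.random() < par:
            # Discrete pitch adjustment: step to an adjacent symbol,
            # clamping at the ends of the alphabet.
            idx = alphabet.index(value) + random.choice([-1, 1])
            idx = max(0, min(len(alphabet) - 1, idx))
            value = alphabet[idx]
    else:
        value = random.choice(alphabet)            # random consideration
    return value
```

The clamping at the alphabet boundaries is one possible design choice; other implementations wrap around or resample instead.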
By analyzing these parameters in detail, it can be identified how the HS algorithm balances diversification and intensification. Both the pitch adjusting rate and the random consideration parameter control the diversification factor. The pitch adjustment can be essentially viewed as a refinement process of local solutions, while the randomization procedure allows examining the search space in a more explorative fashion. The intensification in the HS algorithm is represented by the harmony memory considering rate, which fixes the level of elitism, i.e. the number of solutions from the harmony memory to be kept for the subsequent generation. If the considering rate is set to a low value, the algorithm will converge slowly to the optimum solution; however, a high HMCR value will favor the intensification based on the knowledge contained in the present harmony memory, but at the risk of trapping the algorithm in local optima. In summary, it is essential to properly adjust the parameters controlling the above operators in order to boost the algorithm performance.
Given their strong dependency on the shape of the solution space drawn by the metric function at hand, the superior convergence and behavior of any meta-heuristic algorithm cannot be claimed in a general manner, but instead needs to be assessed by focusing on a certain problem, along with the side constraints that may hold in the mathematical formulation at hand. Therefore, even though a globally optimal algorithm that renders the best performance in all optimization schemes does not exist (in line with the statements of the so-called No Free Lunch Theorem, Wolpert and Macready, 1997), the HS algorithm has so far demonstrated great potential and efficiency in practice, in comparison with other meta-heuristic methods, across a wide spectrum of real applications. HS possesses a similar structure to other existing population-based meta-heuristic solvers, but it incorporates some distinctive features that have made it widely utilized in the literature.
Similarly to other related population-based algorithms, e.g. Genetic Algorithms or Ant Colony Optimization, HS relies on a group of solutions that can be simultaneously exploited for improving the efficiency of the algorithm. However, the naïve Genetic Algorithm considers only two vectors (referred to as parents) for generating a new solution or offspring, whereas the original implementation of HS takes into account, component-wise and on a probabilistic basis, all the existing solutions (melodies) in the harmony memory. Nevertheless, further modifications of the naïve Genetic Algorithm have been proposed in the literature, such as the multi-parent crossover, which modifies the original formulation of the algorithm to take into account more than two individuals in the generation of the new population. By contrast, the HS algorithm, in its original version, is able to infer new solutions by merging the characteristics of all individuals simply by tuning the values of its probabilistic parameters. Besides, it operates independently on each constituent variable (note) of a solution vector (harmony), to which stochastic operators for fine-tuning and randomization are applied. As opposed to gradient-search techniques, the convergence rate of HS and the quality of its produced solutions are not dramatically affected by the initial values of the constituent melodies in the harmony memory. Moreover, HS utilizes a probabilistic gradient which, in contrast to traditional gradient-based mathematical methods, does not require the derivative of the fitness function to be analytically solvable, nor even differentiable over the whole solution space. Instead, the probabilistic gradient converges to progressively better solutions iteration by iteration, since the operators driving the algorithm behavior intelligently guide the harmony memory towards regions of the solution space with better fitness without addressing at all the differentiability of the metric function.
As a result, HS has been shown to perform satisfactorily in both continuous and discrete optimization problems: indeed, it is able to handle both decimal and binary alphabets without modifying the definition of the original HMCR and PAR parameters of the algorithm.
Another remarkable strength of HS hinges on its improvisation operators, which play a major role in achieving a good trade-off between diversification and intensification. As mentioned before, the correct choice of the parameters becomes essential in order to attain progressively better candidate solutions, and HS facilitates this refinement stage by significantly reducing the number of configurable parameters. In addition, the steps and the structure of the HS algorithm are relatively simple, which makes it flexible for combination with other meta-heuristics (Fesanghary et al., 2008) and for implementation in parallel hardware architectures. Consequently, since the advent of this algorithm, the research community has proposed and thoroughly examined variants of HS based on incorporating new components into the HS structure and/or hybridizing it with other solvers so as to improve its searching capabilities.
Application areas
Originally, applications where HS was first assessed as an effective meta-heuristic focused mainly on the design of water distribution networks (Geem, 2006), benchmark optimization (Li and Li, 2007), structural design (Lee et al., 2005) and vehicle routing problems (Geem, 2005, Geem et al., 2005). In 2004 a flowchart representation of HS was published in Lee and Geem (2004), and since then several studies have been devoted to the development of new HS variants and hybridizations with other meta-heuristics.
Overview of HS applications by application area
In this section we delve into the previously presented discipline areas by describing some of the optimization challenges arising therein, as well as analyzing the modifications done to the original HS algorithm in order to efficiently cope with such problems.
Concluding remarks
This article has posed an overview of the recent applications where the music-inspired Harmony Search algorithm has been shown to be an effective meta-heuristic to solve computationally involved optimization paradigms. This overview is broken down into a set of categories divided by application area, which serves as a useful tool for experienced practitioners and beginners alike to get a brief description of the latest activity and trends around HS in every such area. For the sake of brevity each
Acknowledgments
This work was supported by the Gachon University research fund of 2012 (GCU-2012-R034).
References (189)
- et al., Application of an improved harmony search algorithm in well placement optimization using streamline simulation, J. Petrol. Sci. Eng. (2011)
- et al., Broadcast scheduling in packet radio networks using harmony search algorithm, Expert Syst. Appl. (2012)
- Chaotic harmony search algorithms, Appl. Math. Comput. (2010)
- et al., Novel selection schemes for harmony search, Appl. Math. Comput. (2012)
- et al., Design and implementation of a harmony-search-based variable-strength T-way testing strategy with constraints support, Inform. Software Tech. (2012)
- et al., GHS+LEM: global-best harmony search using learnable evolution models, Appl. Math. Comput. (2011)
- et al., An improved harmony search algorithm for power economic load dispatch, Energy Convers. Manage. (2009)
- Improved harmony search algorithms for sizing optimization of truss structures, Comput. Struct. (2012)
- et al., Iterative power and subcarrier allocation in rate-constrained OFDMA downlink systems based on harmony search heuristics, Eng. Appl. Artif. Intell. (2011)
- et al., A comparative study of different local search application strategies in hybrid metaheuristics, Appl. Soft Comput. (2013)