Survey Paper
A survey on applications of the harmony search algorithm

https://doi.org/10.1016/j.engappai.2013.05.008

Abstract

This paper thoroughly reviews and analyzes the main characteristics and application portfolio of the so-called Harmony Search algorithm, a meta-heuristic approach that has been shown to achieve excellent results in a wide range of optimization problems. As evidenced by a number of studies, this algorithm features several innovative aspects in its operational procedure that foster its utilization in diverse fields such as construction, engineering, robotics, telecommunications, health and energy. This manuscript goes through the most recent literature on the application of Harmony Search to the aforementioned disciplines with a three-fold goal: (1) to underline the good behavior of this modern meta-heuristic based on the upsurge of related contributions reported to date; (2) to set a bibliographic basis for future research trends focused on its applicability to other areas; and (3) to provide an insightful analysis of future research lines revolving around this meta-heuristic solver.

Introduction

The application of optimization algorithms to real-world problems has gained momentum in the last decade. Dating back to the early 1940s, traditional mathematical methods such as linear programming (LP), nonlinear programming (NLP) or dynamic programming (DP) were first employed for solving complex optimization problems by resorting to different relaxations of the underlying formulation. These techniques are capable of cost-efficiently obtaining a globally optimal solution in problem models subject to certain particularities (e.g. optimal substructure and overlapping subproblems for dynamic programming), but unfortunately their application range does not cover the whole class of NP-complete problems, for which no exact solution is known to be computable in polynomial time. In fact, the solution space (and hence, the solving time) of such problems increases exponentially with the number of inputs, which makes exact approaches infeasible for practical applications.

In order to cope with the above shortcoming, meta-heuristic techniques—conceived as intelligent self-learning algorithms stemming from the study and mimicking of intelligent processes and behaviors arising in nature, sociology and other disciplines (see Fig. 1 for a general overview)—have come into sight for efficiently tackling this type of hard optimization paradigms. In many cases (especially when large-scale problems are addressed), meta-heuristics are deemed one of the most efficient alternatives to seek and determine a near-optimal solution without relying on exact yet computationally demanding algorithms, and by overcoming the major drawback of local search algorithms, i.e. getting trapped in biased local regions far from the sought global solution. Indeed, therein lies one of the main design challenges of modern meta-heuristic methods: to avoid local optima so as to reach global optima, which can be achieved by exploring the whole search space through the use of intelligent, stochastically driven operators. To this end, it is of utmost importance for enhancing the overall search performance of the meta-heuristic to balance the tradeoff between two related aspects: diversification and intensification. On the one hand, diversification refers to a form of randomization added to the deterministic components driving the algorithm operation in order to explore the search space in a diverse manner, while intensification stands for the improvement of existing partial solutions to the problem by exploring the vicinity of such solutions in the search space (e.g. by means of elitism mechanisms). If diversification is too strong, a great number of zones of the search space may be only loosely explored, which will reduce the convergence rate of the algorithm. By contrast, if diversification is kept too low in the algorithm design, there is a significant risk of leaving a fraction of the solution space unexplored or even producing far-from-optimal solutions due to trapping in local optima. An appropriate intensification is carried out by exploiting the knowledge acquired from past solutions, and by properly fine-tuning the parameters that control the performance of the algorithm for a particular problem so as to enhance its convergence properties.

Based on the above rationale, the last four decades have witnessed the advent and rise of new, innovative meta-heuristic algorithms, grouped into several branches according to their working methodology, and applied to a wide variety of optimization problems such as resource allocation, industrial planning, scheduling, decision making, and medical, engineering and computer science applications, among many others. Many of them are actually inspired by natural phenomena and have been developed by mimicking the intelligence characteristics of biological and physical agents, such as Genetic Algorithms, Cuckoo Search, Ant Colony Optimization, Simulated Annealing, Particle Swarm Optimization or Harmony Search. The main differences among existing meta-heuristics concern (1) the way diversification and intensification are mutually balanced in the algorithm thread; (2) the search path they follow; and/or (3) the observed phenomenon that inspires the algorithm design. Another characteristic of meta-heuristic algorithms is the distinction between memory-based and memoryless methods, as well as between population-based and single-point search-based strategies. For instance, in Ant Colony Optimization a memory of previously obtained solutions is kept in the pheromone trail matrix in order to construct new refined solutions. Likewise, the population in Genetic Algorithms can be regarded as a memory register of recently encountered partial solutions. On the contrary, Simulated Annealing exemplifies the case of a memoryless algorithm in which no record of past solutions is employed.

In contrast to single-point search-based algorithms, in which a unique solution is generated at each iteration, population-based meta-heuristic algorithms maintain a set of solutions (i.e. a population) which evolves at each iteration. Therefore, population-based algorithms provide an efficient and natural way of exploring the search space and obtaining an acceptable solution. Interestingly, Memetic Algorithms hybridize these two strategies in order to balance the diversification and intensification of their search procedure. Most single-point search algorithms, such as Simulated Annealing and Tabu Search, are based on a single neighborhood structure which defines the allowed movements through the search space. However, Iterated Local Search algorithms normally define two different neighborhood structures, between which the algorithm switches when reaching a local optimum.

This paper focuses on the principles of Harmony Search, its comparative analysis with respect to other meta-heuristics, and a detailed report of recent applications and associated developments attained during the last few years. This review allows the interested reader not only to get a clear insight into the design and working principles of this algorithm, but also to identify technical advances and application trends of Harmony Search in diverse fields. To fulfill these goals, the manuscript is structured as follows: once the structure and steps of the original algorithm have been posed in Section 1.1, its intrinsic characteristics are analyzed and compared to other meta-heuristics in Section 1.2. Then, an overview of its application areas and an outline of current trends in such fields are tackled in Sections 2 and 3 by classifying the application cases in the related literature by discipline, and by highlighting the modifications and improvements reported in each particular field. Finally, Section 4 draws some concluding remarks and outlines several future research lines of interest.

Harmony Search (hereafter HS) is a relatively new population-based meta-heuristic algorithm which has obtained excellent results in the field of combinatorial optimization (Geem, 2000). It mimics the behavior of a music orchestra when aiming at composing the most harmonious melody, as measured by aesthetic standards. When comparing the improvisation process of musicians with the optimization task, we can see that each musician corresponds to a decision variable; the musical instrument's pitch range refers to the alphabet of the decision variable; the musical harmony improvised at a certain time corresponds to a solution vector at a given iteration; and the audience's aesthetic impression corresponds to the objective (fitness) function of the optimization problem at hand. Just as musicians improve the melody time after time, the HS algorithm progressively enhances the fitness of the solution vector in an iterative fashion.

As previously mentioned, HS is a population-based algorithm; it hence maintains a set of solutions in the so-called Harmony Memory (HM). An estimate of the optimal solution is produced at every iteration by applying a set of probabilistic operators to the HM, which yields a new harmony vector each time. Fig. 2 illustrates the flow diagram of the HS algorithm, which can be summarized in four steps: (i) initialization of the HM; (ii) improvisation of a new harmony; (iii) inclusion of the newly generated harmony in the HM provided that its fitness improves the worst fitness value in the previous HM; and (iv) returning to step (ii) until a termination criterion (e.g. maximum number of iterations or fitness stall) is satisfied.
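
For illustration purposes only, these four steps can be condensed into the following minimal Python sketch. It is not the implementation used in any of the surveyed works: the fitness function, bounds and parameter values are arbitrary placeholders, and the improvisation step is deliberately reduced to pure random sampling (the actual HMCR/PAR operators are described next).

    import random

    def sphere(x):
        # Toy fitness function to be minimized (placeholder).
        return sum(v * v for v in x)

    def harmony_search(fitness, dim, lower, upper, hms=10, max_iter=5000):
        # (i) Initialize the harmony memory (HM) with random harmonies.
        memory = [[random.uniform(lower, upper) for _ in range(dim)]
                  for _ in range(hms)]
        scores = [fitness(h) for h in memory]
        for _ in range(max_iter):  # (iv) iterate until the stop criterion is met
            # (ii) Improvise a new harmony (simplified here to random consideration only).
            new = [random.uniform(lower, upper) for _ in range(dim)]
            new_score = fitness(new)
            # (iii) Replace the worst harmony in the HM if the new one is better.
            worst = max(range(hms), key=lambda i: scores[i])
            if new_score < scores[worst]:
                memory[worst], scores[worst] = new, new_score
        best = min(range(hms), key=lambda i: scores[i])
        return memory[best], scores[best]

    best_harmony, best_fitness = harmony_search(sphere, dim=5, lower=-10.0, upper=10.0)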

The above improvisation procedure is mainly controlled by two different probabilistic operators, which are sequentially applied to each note so as to produce a new set of improvised harmonies or candidate solutions (both operators are illustrated in the sketch following this list):

  • The Harmony Memory Considering Rate, HMCR ∈ [0,1], establishes the probability that the new value for a certain note is drawn uniformly from the values taken by this same note in the melodies stored in the harmony memory. Otherwise (i.e. with probability 1 − HMCR), the note value is randomly chosen from its alphabet. This case is commonly referred to as random consideration, as it increases the diversity of the solutions and drives the search towards global optimality; however, some works (e.g. Gil-Lopez et al., 2011) implement the random consideration as a third, separate probabilistic operator.

  • The Pitch Adjusting Rate, PAR ∈ [0,1], establishes the probability that the new value x_new for a given note is obtained by adding a small random amount to the existing value x_old. For continuous alphabets, this operation reduces to x_new = x_old + ω_x·ϵ, where ω_x represents the pitch bandwidth and ϵ is a random number drawn from a uniform distribution with support [−1,1]. A low pitch adjusting rate with a narrow bandwidth may restrict the diversification of the algorithm to a small search subspace and, consequently, decrease the convergence rate of the overall solver. On the other hand, a high pitch adjusting rate with a high value of ω_x may force the algorithm to unnecessarily escape from areas with potentially near-optimal solutions. When dealing with discrete alphabets, a similar definition holds for this PAR operator, the difference being that the underlying alphabet must be sorted under some vicinity ordering criterion, which in general depends on the characteristics of the metric defining the problem at hand.
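
As a rough sketch of how these two operators act note by note, the following Python fragment improvises a single new harmony from a continuous-valued harmony memory. The parameter values and the [lower, upper] range are illustrative assumptions, not prescriptions from the original algorithm or the surveyed works.

    import random

    def improvise(memory, hmcr=0.9, par=0.3, bandwidth=0.05,
                  lower=-10.0, upper=10.0):
        # Improvise one new harmony, deciding each note independently.
        new_harmony = []
        for note in range(len(memory[0])):
            if random.random() < hmcr:
                # Memory consideration: reuse this note from a stored melody.
                value = random.choice(memory)[note]
                if random.random() < par:
                    # Pitch adjustment: x_new = x_old + bandwidth * eps, eps ~ U[-1, 1].
                    value += bandwidth * random.uniform(-1.0, 1.0)
                    value = min(max(value, lower), upper)  # keep within the alphabet
            else:
                # Random consideration: draw the note anew from its whole alphabet.
                value = random.uniform(lower, upper)
            new_harmony.append(value)
        return new_harmony

    # Example usage on a toy harmony memory of three 4-note melodies.
    toy_memory = [[random.uniform(-10, 10) for _ in range(4)] for _ in range(3)]
    print(improvise(toy_memory))

Plugging such an improvisation routine into the loop sketched earlier (in place of its purely random step) reproduces the standard HS behavior described in the text.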

By analyzing these parameters in detail, it can be seen how the HS algorithm balances diversification and intensification. Both the pitch adjusting rate and the random consideration control the diversification factor. The pitch adjustment can be essentially viewed as a refinement process of local solutions, while the randomization procedure allows examining the search space in a more explorative fashion. Intensification in the HS algorithm is represented by the harmony memory considering rate, which fixes the level of elitism, i.e. the extent to which solutions from the harmony memory are reused when generating new candidates. If the considering rate is set to a low value, the algorithm will converge slowly to the optimum solution; conversely, a high HMCR value will favor intensification based on the knowledge contained in the current harmony memory, but at the risk of trapping the algorithm in local optima. In summary, it is essential to properly adjust the parameters controlling the above operators in order to boost the algorithm's performance.

Since the convergence and behavior of any meta-heuristic algorithm strongly depend on the shape of the solution space induced by the metric function at hand, its superiority cannot be claimed in a general manner, but instead needs to be assessed by focusing on a certain problem, along with the side constraints that may hold in the mathematical formulation at hand. Therefore, even though a globally optimal algorithm that renders the best performance in all optimization schemes does not exist (in line with the statements of the so-called No Free Lunch Theorem; Wolpert and Macready, 1997), the HS algorithm has so far shown, in practice, great potential and efficiency in comparison with other meta-heuristic methods in a wide spectrum of real applications. HS possesses a structure similar to that of other existing population-based meta-heuristic solvers, but it incorporates some distinctive features that have made it widely utilized in the literature.

Similarly to other population-based algorithms, e.g. Genetic Algorithms or Ant Colony Optimization, HS relies on a group of solutions that can be simultaneously exploited for improving the efficiency of the algorithm. However, the naïve Genetic Algorithm considers only two vectors (referred to as parents) for generating a new solution or offspring, whereas the original implementation of HS takes into account, component-wise and on a probabilistic basis, all the existing solutions (melodies) in the harmony memory. Nevertheless, further modifications of the naïve Genetic Algorithm have been proposed in the literature, such as the multi-parent crossover, which modifies the original formulation of the algorithm to take into account more than two individuals in the generation of the new population. By contrast, the HS algorithm, in its original version, is able to infer new solutions by merging the characteristics of all individuals simply by tuning the values of its probabilistic parameters. Besides, it operates independently on each constituent variable (note) of a solution vector (harmony), to which stochastic operators for fine-tuning and randomization are applied. As opposed to gradient-search techniques, the convergence rate of HS and the quality of the solutions it produces are not dramatically affected by the initial values of the constituent melodies in the harmony memory. Furthermore, HS utilizes a probabilistic gradient which, in contrast to traditional gradient-based mathematical methods, does not require the derivative of the fitness function to be analytically solvable, or even differentiable over the whole solution space. Instead, the probabilistic gradient converges to progressively better solutions iteration by iteration, since the operators driving the algorithm behavior intelligently guide the harmony memory towards regions of the solution space with better fitness without considering the differentiability of the metric function at all. As a result, HS has been shown to perform satisfactorily in both continuous and discrete optimization problems: indeed, it is able to handle both decimal and binary alphabets without modifying the definition of the original HMCR and PAR parameters of the algorithm.

Another remarkable strength of HS hinges on its improvisation operators, which play a major role in achieving a good trade-off between diversification and intensification. As mentioned before, a correct choice of the parameters becomes essential in order to attain progressively better candidate solutions, and HS facilitates this refinement stage by significantly reducing the number of configurable parameters. In addition, the steps and structure of the HS algorithm are relatively simple, which makes it flexible for combination with other meta-heuristics (Fesanghary et al., 2008) and for implementation on parallel hardware architectures. Consequently, since the advent of this algorithm, the research community has extensively proposed and examined variants of HS that incorporate new components into the HS structure and/or hybridize it with other solvers so as to improve its search capabilities.

Section snippets

Application areas

Originally, applications where HS was first assessed as an effective meta-heuristic focused mainly on the design of water distribution networks (Geem, 2006), benchmark optimization (Li and Li, 2007), structural design (Lee et al., 2005) and vehicle routing problems (Geem, 2005, Geem et al., 2005). In 2004 a flowchart representation of HS was published in Lee and Geem (2004) and since then several studies were devoted to the development of new HS variants and hybridizations with other

Overview of HS applications by application area

In this section we delve into the previously presented discipline areas by describing some of the optimization challenges arising therein, as well as by analyzing the modifications made to the original HS algorithm in order to efficiently cope with such problems.

Concluding remarks

This article has posed an overview of the recent applications where the music-inspired Harmony Search algorithm has been shown to be an effective meta-heuristic for solving computationally involved optimization paradigms. This overview is broken down into a set of categories divided by application area, which serves as a useful tool for experienced practitioners and beginners alike to get a brief description of the latest activity and trends around HS in every such area. For the sake of brevity each

Acknowledgments

This work was supported by the Gachon University research fund of 2012 (GCU-2012-R034).

References (189)

  • F. Erdal et al., Optimum design of cellular beams using harmony search and particle swarm optimizers, J. Constr. Steel Res. (2011)
  • M. Fesanghary et al., Hybridizing harmony search algorithm with sequential quadratic programming for engineering optimization problems, Comput. Methods Appl. Mech. Eng. (2008)
  • M. Fesanghary et al., Design optimization of shell and tube heat exchangers using global sensitivity analysis and harmony search algorithm, Appl. Therm. Eng. (2009)
  • M. Fesanghary et al., Design of low-emission and energy-efficient residential buildings using a multi-objective optimization algorithm, Build. Environ. (2012)
  • J. Gao et al., A PAPR reduction algorithm based on harmony research for OFDM systems, Procedia Eng. (2011)
  • Z.W. Geem et al., Parameter-setting-free harmony search algorithm, Appl. Math. Comput. (2010)
  • Z.W. Geem, Discussion on combined heat and power economic dispatch by harmony search algorithm, Int. J. Electr. Power Energy Syst. (2011)
  • Z.W. Geem, Effects of initial memory and identical harmony in global optimization using harmony search algorithm, Appl. Math. Comput. (2012)
  • S. Gil-Lopez et al., A hybrid harmony search algorithm for the spread spectrum radar polyphase codes design problem, Expert Syst. Appl. (2012)
  • M. Jaberipour et al., Solving the sum-of-ratios problems by a harmony search algorithm, J. Comput. Appl. Math. (2010)
  • A. Kaveh et al., Particle swarm optimizer, ant colony strategy and harmony search scheme hybridized for optimization of truss structures, Comput. Struct. (2009)
  • A. Kaveh et al., Cost optimization of a composite floor system using an improved harmony search algorithm, J. Constr. Steel Res. (2010)
  • A. Kaveh et al., Discrete cost optimization of composite floor system using social harmony search model, Appl. Soft Comput. (2012)
  • E. Khorram et al., Harmony search algorithm for solving combined heat and power economic dispatch problems, Energy Convers. Manage. (2011)
  • Ahmed, A.M., Bakar, A.A., Hamdan, A.R., 2011. Harmony search algorithm for optimal word size in symbolic time series...
  • Al-Betar, M.A., Khader, A.T., Liao, I.Y., 2010. A harmony search with multi-pitch adjusting rate for the university...
  • M.A. Al-Betar et al., University course timetabling using a hybrid harmony search metaheuristic algorithm, IEEE Trans. Syst. Man Cybern. Part C: Appl. Rev. (2012)
  • E. Alexandre et al., Sound classification in hearing aids by the harmony search algorithm, Stud. Comput. Intell. (2009)
  • O.M. Alia et al., The variants of the harmony search algorithm: an overview, Artif. Intell. Rev. (2011)
  • Alia, O.M., Mandava, R., Aziz, M.E., 2010. A hybrid harmony search algorithm to MRI brain segmentation. In: IEEE...
  • Alsewari, A.A., Zamli, K.Z., 2011. Interaction test data generation using harmony search algorithm. In: IEEE Symposium...
  • A. Askarzadeh et al., An innovative global harmony search algorithm for parameter identification of a PEM fuel cell model, IEEE Trans. Ind. Electron. (2012)
  • Ayachi, I., Kammarti, R., Ksouri, M., Borne, P., 2011. Application of harmony search to container storage location. In:...
  • M.T. Ayvaz, Application of harmony search algorithm to the solution of groundwater management models, Adv. Water Resour. (2008)
  • M.T. Ayvaz, Identification of groundwater parameter structure using harmony search algorithm, Stud. Comput. Intell. (2009)
  • M.T. Ayvaz, Solution of groundwater management problems using harmony search algorithm, Stud. Comput. Intell. (2010)
  • Barzegari, M., Bathaee, S.M.T., Mohsen, A., 2010. Optimal coordination of directional overcurrent relays using harmony...
  • G. Bekdaş et al., Estimating optimum parameters of tuned mass dampers using harmony search, Eng. Struct. (2011)
  • G. Bo et al., The application of harmony search in fourth-party logistics routing problems, Stud. Comput. Intell. (2010)
  • H. Ceylan et al., Harmony search algorithm for transport energy demand modeling, Stud. Comput. Intell. (2009)
  • Ceylan, O., Dag, H., Ozdemir, A., 2010. Harmony search method based parallel contingency analysis. In: International...
  • Chandran, L.P., Nazeer, K.A.A., 2011. An improved clustering algorithm based on K-means and harmony search...
  • A. Chatterjee et al., Solution of combined economic and emission dispatch problems of power systems by an opposition-based harmony search algorithm, Int. J. Electr. Power Energy Syst. (2011)
  • Y.M. Cheng, Modified harmony methods for slope stability problems, Stud. Comput. Intell. (2009)
  • Cheng, P., Yong, W., 2010. A hybrid simplex harmony search algorithm and its application to model reduction of linear...
  • M. Cisty, Application of the harmony search optimization in irrigation, Stud. Comput. Intell. (2010)
  • Cobos, C., Andrade, J., Constain, W., Mendoza, M., Leon, E., 2010. Web document clustering based on global-best harmony...
  • L.S. Coelho et al., An improved harmony search algorithm for synchronization of discrete-time chaotic systems, Chaos Soliton Fract. (2009)
  • L.S. Coelho et al., A harmony search approach using exponential probability distribution applied to fuzzy logic control optimization, Stud. Comput. Intell. (2010)
  • Coelho, L.S., Bernert, L.A., Mariani, V.C., 2010b. Chaotic differential harmony search algorithm applied to power...