
2014 | Original Paper | Book Chapter

5. Improving Global Convergence

Authors: Serkan Kiranyaz, Turker Ince, Moncef Gabbouj

Published in: Multidimensional Particle Swarm Optimization for Machine Learning and Pattern Recognition

Publisher: Springer Berlin Heidelberg


Abstract

As a natural extension of PSO, MD PSO may also suffer from premature convergence to a local optimum, due to the direct link of information flow between particles and gbest, which “guides” the rest of the swarm and can cause a loss of diversity. This phenomenon increases the probability of being trapped in local optima [1] and is the main cause of the premature convergence problem, especially when the search space is high dimensional [2] and the problem to be optimized is multimodal [1]. Another reason for premature convergence is that particles are flown through a single point which is (randomly) determined by the gbest and pbest positions, and this point is not even guaranteed to be a local optimum [3]. Various modifications and PSO variants have been proposed to address this problem [1, 3–28]. As briefly discussed in Sect. 3.3, such methods usually try to improve the diversity among the particles and the search mechanism, either by changing the update equations toward a more diversified version, by adding more randomization to the system (to particle velocities, positions, etc.), or simply by resetting some or all particles randomly when certain conditions are met. On the one hand, most of these variants require additional parameters, making the algorithms even more parameter dependent. On the other hand, the main problem is in fact the inability of the algorithm to use the diversity that is already available in one or more positional components of a particle. Note that one or more components of any particle may already be in close vicinity of the global optimum. This potential is then wasted by the (velocity) update in the next iteration, which changes all the components at once. In this chapter, we shall address this drawback of global convergence by developing two efficient techniques.
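To make the wasted-diversity point concrete, the sketch below shows the canonical bPSO update with an inertia weight, in the spirit of Shi and Eberhart [22]; the coefficient values are typical choices, not taken from this chapter. Because the update displaces every component of the position at once, a component already near the global optimum gets perturbed along with the rest.

```python
import numpy as np

rng = np.random.default_rng(0)

def bpso_update(v, x, pbest, gbest, w=0.72, c1=1.49, c2=1.49):
    # Canonical bPSO update with inertia weight w: stochastic pulls
    # toward the particle's own pbest and the swarm's gbest.
    r1 = rng.random(x.shape)
    r2 = rng.random(x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    # Every component of x is displaced at once, so a component that
    # already sits near the global optimum is perturbed with the rest.
    return v_new, x + v_new
```

Note that when a particle coincides with both its pbest and the gbest, only the inertia term remains, so the particle keeps drifting along its previous velocity.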
The first one, the so-called Fractional Global Best Formation (FGBF), collects all such promising (or simply the best) components from each particle and fractionally creates an artificial global best candidate, the aGB, which becomes the swarm’s global best (GB) particle if it is better than the previous GB and the just-computed gbest. Note that whenever a better gbest particle or aGB particle emerges, it replaces the current GB particle. Without any additional change, we shall show that FGBF can avoid local optima and thus yield the optimum (or a near-optimum) solution efficiently, even in high-dimensional search spaces. Unfortunately, FGBF is not an entirely generic technique; it must be specifically adapted to the problem at hand (we shall return to this issue later). To address this drawback efficiently, we shall further present two generic approaches, one of which moves gbest efficiently or, simply put, “guides” it with respect to the function (or error surface) it resides on. The idea behind this is quite simple: since the velocity update equation of gbest is quite poor, we replace it with a simple yet powerful stochastic search technique. We shall then show that, due to the stochastic nature of this search technique, the likelihood of getting trapped in a local optimum is significantly decreased.
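A minimal sketch of the aGB assembly step may help. The per-dimension score function `dim_scores` is an assumption here: FGBF requires a problem-specific measure of each component's quality, which for a separable function such as the sphere can simply be the squared component. The function name `fgbf` is illustrative, not the book's implementation.

```python
import numpy as np

def fgbf(swarm, f, dim_scores, gb):
    # Collect, per dimension, the component with the best (lowest)
    # dimensional score across all particles in the swarm.
    scores = np.vstack([dim_scores(p) for p in swarm])  # (particles, dims)
    winners = scores.argmin(axis=0)                     # best particle per dim
    a_gb = np.array([swarm[winners[j]][j] for j in range(scores.shape[1])])
    # The aGB candidate replaces the current GB particle only if it
    # actually improves on it.
    return a_gb if f(a_gb) < f(gb) else gb
```

For example, with two particles [0, 5] and [5, 0] on the 2-D sphere function, neither particle is near the optimum as a whole, yet the assembled aGB [0, 0] is exactly the global optimum: the diversity present in individual components is harvested instead of wasted.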

Footnotes
1
In Maeda and Kuratani [46], the function evaluations are given with respect to the iteration number; however, it should have been noted that SP-PSO performs twice as many evaluations per iteration as bPSO. Taking this into account, the plots therein show little or no performance improvement.
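The factor of two comes from the simultaneous perturbation gradient estimate of Spall [43], which needs exactly two function evaluations per iteration regardless of the search dimension. A minimal sketch of one such step is given below; the gain values `a` and `c` are illustrative constants, not the decaying gain sequences a full SPSA implementation would use.

```python
import numpy as np

rng = np.random.default_rng(1)

def spsa_step(theta, f, a=0.05, c=0.1):
    # One simultaneous-perturbation step: a single random +/-1
    # direction and exactly two function evaluations, independent
    # of the number of dimensions.
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    y_plus = f(theta + c * delta)
    y_minus = f(theta - c * delta)
    g_hat = (y_plus - y_minus) / (2.0 * c * delta)  # gradient estimate
    return theta - a * g_hat
```

So an SP-PSO iteration that applies this step to each guided particle spends two evaluations where bPSO spends one, and fair comparisons should be made per function evaluation rather than per iteration.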
 
References
1. J. Riget, J.S. Vesterstrom, A diversity-guided particle swarm optimizer—the ARPSO, Technical report, Department of Computer Science, University of Aarhus, 2002
3. F. van den Bergh, A.P. Engelbrecht, A new locally convergent particle swarm optimizer, in Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (2002), pp. 96–101
4. A. Abraham, S. Das, S. Roy, Swarm intelligence algorithms for data clustering, in Soft Computing for Knowledge Discovery and Data Mining, Part IV (2007), pp. 279–313
5. T.M. Blackwell, Particle swarm optimization in dynamic environments, in Evolutionary Computation in Dynamic and Uncertain Environments, Studies in Computational Intelligence, vol. 51 (Springer, Berlin, 2007), pp. 29–49
6. Y.-P. Chen, W.-C. Peng, M.-C. Jian, Particle swarm optimization with recombination and dynamic linkage discovery. IEEE Trans. Syst. Man Cybern. Part B 37(6), 1460–1470 (2007)
7. K.M. Christopher, K.D. Seppi, The Kalman swarm: a new approach to particle motion in swarm optimization, in Proceedings of the Genetic and Evolutionary Computation Conference, GECCO (2004), pp. 140–150
8. L.-Y. Chuang, H.W. Chang, C.J. Tu, C.H. Yang, Improved binary PSO for feature selection using gene expression data. Comput. Biol. Chem. 32(1), 29–38 (2008)
9. R. Eberhart, P. Simpson, R. Dobbins, Computational Intelligence PC Tools (Academic, Boston, 1996)
10. H. Higashi, H. Iba, Particle swarm optimization with Gaussian mutation, in Proceedings of the IEEE Swarm Intelligence Symposium (2003), pp. 72–79
11. S. Janson, M. Middendorf, A hierarchical particle swarm optimizer and its adaptive variant. IEEE Trans. Syst. Man Cybern. Part B 35(6), 1272–1282 (2005)
12. B. Kaewkamnerdpong, P.J. Bentley, Perceptive particle swarm optimization: an investigation, in Proceedings of the IEEE Swarm Intelligence Symposium (California, 8–10 June 2005), pp. 169–176
13. U. Kressel, Pairwise classification and support vector machines, in Advances in Kernel Methods—Support Vector Learning (1999)
14. J.J. Liang, A.K. Qin, Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans. Evol. Comput. 10(3), 281–295 (2006)
15. Y. Lin, B. Bhanu, Evolutionary feature synthesis for object recognition. IEEE Trans. Syst. Man Cybern. Part C 35(2), 156–171 (2005)
16. M. Løvberg, T. Krink, Extending particle swarm optimizers with self-organized criticality, in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2 (2002), pp. 1588–1593
17. W. McCulloch, W. Pitts, A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 7, 115–133 (1943)
18. J. Pan, W.J. Tompkins, A real-time QRS detection algorithm. IEEE Trans. Biomed. Eng. 32(3), 230–236 (1985)
19. T. Peram, K. Veeramachaneni, C.K. Mohan, Fitness-distance-ratio based particle swarm optimization, in Proceedings of the IEEE Swarm Intelligence Symposium (IEEE Press, 2003), pp. 174–181
20. K. Price, R.M. Storn, J.A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization (Springer, Berlin, 2005). ISBN 978-3-540-20950-8
21. A.C. Ratnaweera, S.K. Halgamuge, H.C. Watson, Particle swarm optimiser with time varying acceleration coefficients, in Proceedings of the International Conference on Soft Computing and Intelligent Systems (2002), pp. 240–255
22. Y. Shi, R.C. Eberhart, A modified particle swarm optimizer, in Proceedings of the IEEE Congress on Evolutionary Computation (1998), pp. 69–73
23. Y. Shi, R.C. Eberhart, Fuzzy adaptive particle swarm optimization, in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 1 (IEEE Press, 2001), pp. 101–106
24. F. van den Bergh, A.P. Engelbrecht, A cooperative approach to particle swarm optimization. IEEE Trans. Evol. Comput. 3, 225–239 (2004)
25. X. Xie, W. Zhang, Z. Yang, A dissipative particle swarm optimization, in Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2 (2002), pp. 1456–1461
26. X. Xie, W. Zhang, Z. Yang, Adaptive particle swarm optimization on individual level, in Proceedings of the Sixth International Conference on Signal Processing, vol. 2 (2002), pp. 1215–1218
27. X. Xie, W. Zhang, Z. Yang, Hybrid particle swarm optimizer with mass extinction, in Proceedings of the International Conference on Communication, Circuits and Systems, vol. 2 (2002), pp. 1170–1173
28. W.-J. Zhang, X.-F. Xie, DEPSO: hybrid particle swarm with differential evolution operator, in Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, vol. 4 (2003), pp. 3816–3821
29. P.J. Angeline, Evolutionary optimization versus particle swarm optimization: philosophy and performance differences, in Evolutionary Programming VII, Conference EP’98, Lecture Notes in Computer Science, no. 1447 (Springer, California, USA, 1998), pp. 410–601
30. S.C. Esquivel, C.A. Coello Coello, On the use of particle swarm optimization with multimodal functions, in Proceedings of the IEEE Congress on Evolutionary Computation (IEEE Press, 2003), pp. 1130–1136
31. M. Richards, D. Ventura, Dynamic sociometry in particle swarm optimization, in Proceedings of the Sixth International Conference on Computational Intelligence and Natural Computing (North Carolina, 2003), pp. 1557–1560
32. P.J. Angeline, Using selection to improve particle swarm optimization, in Proceedings of the IEEE Congress on Evolutionary Computation (IEEE Press, 1998), pp. 84–89
34. T.M. Blackwell, J. Branke, Multi-swarm optimization in dynamic environments, in Applications of Evolutionary Computation, vol. 3005 (Springer, Berlin, 2004), pp. 489–500
35. T.M. Blackwell, J. Branke, Multiswarms, exclusion, and anti-convergence in dynamic environments. IEEE Trans. Evol. Comput. 10(4), 51–58 (2004)
36. X. Li, J. Branke, T. Blackwell, Particle swarm with speciation and adaptation in a dynamic environment, in Proceedings of the Genetic and Evolutionary Computation Conference (Seattle, Washington, 2006), pp. 51–58
37. R. Mendes, A. Mohais, DynDE: a differential evolution for dynamic optimization problems, in IEEE Congress on Evolutionary Computation (2005), pp. 2808–2815
38. I. Moser, T. Hendtlass, A simple and efficient multi-component algorithm for solving dynamic function optimisation problems, in IEEE Congress on Evolutionary Computation (2007), pp. 252–259
39. M.A. Styblinski, T.-S. Tang, Experiments in nonconvex optimization: stochastic approximation with function smoothing and simulated annealing. Neural Netw. 3(4), 467–483 (1990)
40. H.J. Kushner, G.G. Yin, Stochastic Approximation Algorithms and Applications (Springer, New York, 1997)
41. S.B. Gelfand, S.K. Mitter, Recursive stochastic algorithms for global optimization. SIAM J. Control Optim. 29(5), 999–1018 (1991)
42. D.C. Chin, A more efficient global optimization algorithm based on Styblinski and Tang. Neural Netw. (1994), pp. 573–574
43. J.C. Spall, Multivariate stochastic approximation using a simultaneous perturbation gradient approximation. IEEE Trans. Autom. Control 37(3), 332–341 (1992)
44. J.C. Spall, Implementation of the simultaneous perturbation algorithm for stochastic optimization. IEEE Trans. Aerosp. Electron. Syst. 34, 817–823 (1998)
45. J.L. Maryak, D.C. Chin, Global random optimization by simultaneous perturbation stochastic approximation, in Proceedings of the 33rd Conference on Winter Simulation (Washington, DC, 2001), pp. 307–312
46. Y. Maeda, T. Kuratani, Simultaneous perturbation particle swarm optimization, in IEEE Congress on Evolutionary Computation, CEC’06 (2006), pp. 672–676
DOI
https://doi.org/10.1007/978-3-642-37846-1_5