
2021 | Original Paper | Book Chapter

13. Training Multi-layer Perceptron Using Hybridization of Chaotic Gravitational Search Algorithm and Particle Swarm Optimization

Authors: Sajad Ahmad Rather, P. Shanthi Bala, Pillai Lekshmi Ashokan

Published in: Applying Particle Swarm Optimization

Publisher: Springer International Publishing


Abstract

A novel hybridization of the chaotic gravitational search algorithm (CGSA) and particle swarm optimization (PSO), called CGSAPSO, is employed to train a multi-layer perceptron (MLP) neural network. In CGSAPSO, exploration is carried out by CGSA and exploitation by PSO. The sigmoid activation function is used in the MLP, and a matrix encoding strategy maps the neural weights and biases onto CGSAPSO search agents. To validate the effectiveness of the hybrid framework, CGSAPSO is applied to three classification datasets: XOR, Iris, and Balloon. The results are analyzed through several performance metrics, including average, standard deviation, median, convergence speed, execution time, and classification rate. A pair-wise non-parametric Wilcoxon rank-sum test is also conducted for statistical verification of the simulation results, and the numerical outcomes of CGSAPSO are compared with standard GSA, PSO, and the hybrid PSOGSA. The experimental results indicate that CGSAPSO achieves better recognition accuracy and approaches the global optimum more closely than the competing algorithms.
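The matrix encoding and the exploration/exploitation split described above can be sketched in code. The following is an illustrative reconstruction only, not the authors' implementation: the abstract does not give the CGSAPSO update equations, the chaotic map, or any parameter values, so this sketch follows a generic PSOGSA-style update (GSA-style accelerations combined with a velocity pull toward the global best) and drives the gravitational constant with a logistic map. The hidden-layer size, all constants, and all function names here are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def decode(agent, n_in, n_hid, n_out):
    """Matrix encoding: unpack one flat search-agent vector into the
    MLP's weight matrices and bias vectors."""
    i = 0
    W1 = agent[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = agent[i:i + n_hid];                             i += n_hid
    W2 = agent[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = agent[i:i + n_out]
    return W1, b1, W2, b2

def fitness(agent, X, y, n_in, n_hid, n_out):
    """Mean squared error of the sigmoid MLP encoded by `agent`."""
    W1, b1, W2, b2 = decode(agent, n_in, n_hid, n_out)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return np.mean((out - y) ** 2)

def train_cgsapso(X, y, n_in, n_hid, n_out, n_agents=20, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    dim = n_in * n_hid + n_hid + n_hid * n_out + n_out
    pos = rng.uniform(-1.0, 1.0, (n_agents, dim))
    vel = np.zeros((n_agents, dim))
    chaos = 0.7                      # logistic-map state driving G(t)
    best_pos, best_fit = None, np.inf
    for t in range(iters):
        fit = np.array([fitness(p, X, y, n_in, n_hid, n_out) for p in pos])
        if fit.min() < best_fit:
            best_fit, best_pos = fit.min(), pos[fit.argmin()].copy()
        # Chaotic gravitational constant (CGSA exploration component)
        chaos = 4.0 * chaos * (1.0 - chaos)
        G = chaos * (1.0 - t / iters)
        # GSA-style normalized masses (best agent -> heaviest)
        worst, bst = fit.max(), fit.min()
        m = (fit - worst) / (bst - worst - 1e-12)
        M = m / (m.sum() + 1e-12)
        acc = np.zeros_like(pos)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                d = pos[j] - pos[i]
                acc[i] += rng.random() * G * M[j] * d / (np.linalg.norm(d) + 1e-12)
        # PSO-style exploitation: pull agents toward the global best
        vel = (0.5 * vel
               + 1.5 * rng.random((n_agents, dim)) * acc
               + 1.5 * rng.random((n_agents, dim)) * (best_pos - pos))
        pos += vel
    return best_pos, best_fit

# XOR, one of the three benchmark datasets used in the chapter
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)
w, err = train_cgsapso(X, y, n_in=2, n_hid=4, n_out=1)
print(f"final MSE on XOR: {err:.4f}")
```

The same `decode`/`fitness` pair works unchanged for the Iris or Balloon datasets by adjusting the layer sizes; on XOR with a 2-4-1 architecture, each agent is a flat vector of 2·4 + 4 + 4·1 + 1 = 17 dimensions.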


Metadata

Title: Training Multi-layer Perceptron Using Hybridization of Chaotic Gravitational Search Algorithm and Particle Swarm Optimization
Authors: Sajad Ahmad Rather, P. Shanthi Bala, Pillai Lekshmi Ashokan
Copyright Year: 2021
Publisher: Springer International Publishing
DOI: https://doi.org/10.1007/978-3-030-70281-6_13