
19.03.2020 | Methodologies and Application

A hybrid grasshopper and new cat swarm optimization algorithm for feature selection and optimization of multi-layer perceptron

Authors: Priti Bansal, Sachin Kumar, Sagar Pasrija, Sachin Singh

Published in: Soft Computing | Issue 20/2020

Abstract

The classification accuracy of a multi-layer perceptron (MLP) depends on the selection of relevant features from the data set, its architecture, its connection weights and its transfer functions. Generating optimal values for all of these parameters together is a complex task. Metaheuristic algorithms are a popular choice among researchers for solving such complex optimization problems. This paper presents a hybrid metaheuristic algorithm, the simple matching-grasshopper new cat swarm optimization algorithm (SM-GNCSOA), that optimizes all four components simultaneously. SM-GNCSOA combines the grasshopper optimization algorithm, a new variant of the binary grasshopper optimization algorithm called the simple matching-binary grasshopper optimization algorithm, and a new variant of the cat swarm optimization algorithm called the new cat swarm optimization algorithm to generate an optimal MLP. Features play a vital role in determining the classification accuracy of a classifier. Here, we propose a new feature penalty function and use it in SM-GNCSOA to prevent underfitting or overfitting caused by the number of selected features. To evaluate the performance of SM-GNCSOA, several variants of SM-GNCSOA are proposed and their classification accuracies are compared with that of SM-GNCSOA on ten classification data sets. The results show that SM-GNCSOA performs better on most of the data sets owing to its ability to balance exploration and exploitation and to avoid local minima.
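The abstract does not spell out the fitness formulation, so the following is only a minimal sketch of how a wrapper-style fitness with a feature penalty can be wired up in Python. The function wrapper_fitness, the fixed (10,) hidden layer, and the linear penalty weighted by alpha are illustrative assumptions, not the authors' SM-GNCSOA formulation.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    def wrapper_fitness(feature_mask, X, y, alpha=0.05):
        """Score a binary feature mask: cross-validated MLP accuracy minus a
        penalty proportional to the fraction of features kept.
        (Illustrative penalty only; the paper defines its own penalty function.)"""
        selected = np.flatnonzero(feature_mask)
        if selected.size == 0:                        # an empty subset cannot classify
            return 0.0
        mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=0)
        accuracy = cross_val_score(mlp, X[:, selected], y, cv=5).mean()
        penalty = alpha * selected.size / X.shape[1]  # grows with the number of selected features
        return accuracy - penalty

A binary metaheuristic such as a binary grasshopper optimizer would then search over candidate feature_mask vectors to maximize this score; in the paper, the architecture, connection weights and transfer functions are optimized alongside the feature subset.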


Metadata
Title
A hybrid grasshopper and new cat swarm optimization algorithm for feature selection and optimization of multi-layer perceptron
Authors
Priti Bansal
Sachin Kumar
Sagar Pasrija
Sachin Singh
Publication date
19.03.2020
Publisher
Springer Berlin Heidelberg
Published in
Soft Computing / Issue 20/2020
Print ISSN: 1432-7643
Electronic ISSN: 1433-7479
DOI
https://doi.org/10.1007/s00500-020-04877-w
