Published in: Soft Computing 17/2019

30.07.2018 | Methodologies and Application

An efficient hybrid multilayer perceptron neural network with grasshopper optimization


Abstract

This paper proposes a new hybrid stochastic training algorithm for multilayer perceptron (MLP) neural networks based on the recently proposed grasshopper optimization algorithm (GOA). GOA is an emerging technique with high potential for tackling optimization problems thanks to its flexible and adaptive search mechanisms; it performs well by escaping local optima and balancing exploration and exploitation. The proposed GOAMLP model is applied to five important datasets: breast cancer, Parkinson's disease, diabetes, coronary heart disease, and orthopedic patients. The results are validated in depth, both qualitatively and quantitatively, against eight recent and well-regarded algorithms. They show that the proposed stochastic training algorithm GOAMLP substantially improves the classification rate of MLPs.
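The core idea, training an MLP by letting GOA search the space of connection weights, can be illustrated with a minimal pure-Python sketch. This is not the authors' implementation: the 2-3-1 network, the XOR toy data (standing in for the medical datasets), and the GOA parameters (f = 0.5, l = 1.5, comfort-zone coefficient c shrinking from 1 to 1e-5, as in the common defaults of Saremi et al. 2017) are illustrative assumptions.

```python
import math
import random

random.seed(1)

N_IN, N_HID = 2, 3
DIM = N_HID * (N_IN + 1) + (N_HID + 1)  # flattened weight vector length: 13

def mlp_predict(w, x):
    """Forward pass of a 2-3-1 MLP: tanh hidden units, sigmoid output."""
    idx, h = 0, []
    for _ in range(N_HID):
        s = w[idx + N_IN]                          # hidden-unit bias
        for i in range(N_IN):
            s += w[idx + i] * x[i]
        h.append(math.tanh(s))
        idx += N_IN + 1
    out = w[idx + N_HID]                           # output bias
    for j in range(N_HID):
        out += w[idx + j] * h[j]
    return 1.0 / (1.0 + math.exp(-out))

def mse(w, data):
    """Fitness of one grasshopper = mean squared classification error."""
    return sum((mlp_predict(w, x) - y) ** 2 for x, y in data) / len(data)

def s_func(r, f=0.5, l=1.5):
    """GOA social-forces function s(r) = f*exp(-r/l) - exp(-r)."""
    return f * math.exp(-r / l) - math.exp(-r)

def goa_train(data, n_agents=20, iters=150, lb=-5.0, ub=5.0):
    pop = [[random.uniform(lb, ub) for _ in range(DIM)] for _ in range(n_agents)]
    target = min(pop, key=lambda w: mse(w, data))[:]   # best weights found so far
    c_max, c_min = 1.0, 1e-5
    for t in range(iters):
        c = c_max - t * (c_max - c_min) / iters        # shrinking comfort zone
        # pairwise Euclidean distances between agents, computed once per iteration
        dist = [[math.sqrt(sum((a[k] - b[k]) ** 2 for k in range(DIM)))
                 for b in pop] for a in pop]
        new_pop = []
        for i in range(n_agents):
            new_w = []
            for d in range(DIM):
                social = 0.0
                for j in range(n_agents):
                    if j == i or dist[i][j] < 1e-12:
                        continue
                    social += (c * (ub - lb) / 2.0 * s_func(dist[i][j])
                               * (pop[j][d] - pop[i][d]) / dist[i][j])
                # move under social forces, biased toward the target grasshopper
                new_w.append(min(ub, max(lb, c * social + target[d])))
            new_pop.append(new_w)
        pop = new_pop
        best = min(pop, key=lambda w: mse(w, data))
        if mse(best, data) < mse(target, data):        # elitism: keep best-so-far
            target = best[:]
    return target

# toy binary task (XOR) standing in for a real classification dataset
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
weights = goa_train(data)
accuracy = sum((mlp_predict(weights, x) > 0.5) == bool(y) for x, y in data) / len(data)
print("MSE:", round(mse(weights, data), 4), "accuracy:", accuracy)
```

Because the target (elitist best-so-far) solution is only ever replaced by a strictly better one, the training error is monotonically non-increasing over iterations; this is the mechanism by which a swarm optimizer can substitute for gradient-based MLP training.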


Metadata
Title
An efficient hybrid multilayer perceptron neural network with grasshopper optimization
Publication date
30.07.2018
Published in
Soft Computing / Issue 17/2019
Print ISSN: 1432-7643
Electronic ISSN: 1433-7479
DOI
https://doi.org/10.1007/s00500-018-3424-2
