Published in: Neural Computing and Applications 14/2020

23.11.2019 | Original Article

Swarm intelligence based approach for efficient training of regressive neural networks

Abstract

This work proposes an efficient approach to the problem of training a regressive neural network. Regressive networks are characterized by delay lines, possibly on both the input and the output feedback. Each delay line is connected to the network through synaptic weights and therefore increases the number of parameters that the training algorithm must optimize. Training algorithms such as Levenberg–Marquardt, normally used to train neural networks, are prone to entrapment in local minima, so a strategy for correctly initializing the training procedure is needed. To address this problem, the continuous flock-of-starlings optimization algorithm, a highly explorative swarm-intelligence optimizer, is used. The proposed approach is tested and validated on an experimental benchmark featuring a second-order nonlinear dynamic system.
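The full text is not reproduced on this page, but the pipeline the abstract outlines lends itself to a compact illustration: build a regressive (NARX-style) model whose regressors are delayed inputs and delayed outputs, use a highly explorative swarm optimizer to find a good starting weight vector, then refine it with a derivative-based method. The Python/NumPy sketch below is a minimal illustration of that idea under stated assumptions only; the toy plant, network layout, delays, PSO-style swarm update (standing in for the paper's CFSO algorithm) and the finite-difference gradient refinement (standing in for Levenberg–Marquardt) are illustrative choices, not the authors' implementation.

# Minimal sketch of the idea in the abstract (not the authors' code): a NARX-style
# regressive model with delay lines on the input and the measured output, whose
# flattened weight vector is first explored by a PSO-like swarm (a stand-in for the
# continuous flock-of-starlings optimizer) and then refined by a plain
# finite-difference gradient step (a stand-in for Levenberg-Marquardt).
# Plant, layer sizes, delays and swarm constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulate_plant(u):
    # Illustrative second-order nonlinear dynamic system (not the paper's benchmark).
    y = np.zeros_like(u)
    for t in range(2, len(u)):
        y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + 0.3 * np.tanh(u[t - 1]) + 0.1 * u[t - 2]
    return y

u = rng.uniform(-1.0, 1.0, 500)
y = simulate_plant(u)

DU, DY, H = 2, 2, 8                      # input delays, output-feedback delays, hidden units

def make_dataset(u, y, du=DU, dy=DY):
    # Series-parallel (open-loop) training data: regressors are delayed u and delayed y.
    X, T = [], []
    for t in range(max(du, dy), len(u)):
        X.append(np.concatenate([u[t - du:t], y[t - dy:t]]))
        T.append(y[t])
    return np.array(X), np.array(T)

X, T = make_dataset(u, y)
D = X.shape[1]
N_W = H * D + H + H + 1                  # W1, b1, W2, b2 flattened into one vector

def unpack(w):
    i = 0
    W1 = w[i:i + H * D].reshape(H, D); i += H * D
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H]; i += H
    return W1, b1, W2, w[i]

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    e = np.tanh(X @ W1.T + b1) @ W2 + b2 - T
    return float(np.mean(e * e))

# Global exploration: a basic PSO update over the enlarged weight space.
P, ITERS = 30, 200
pos = rng.uniform(-1, 1, (P, N_W))
vel = np.zeros((P, N_W))
pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
for _ in range(ITERS):
    r1, r2 = rng.random((P, N_W)), rng.random((P, N_W))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([mse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

# Local refinement from the swarm's best point (gradient descent as a rough
# substitute for the Levenberg-Marquardt step used in the paper).
w, eps, lr = gbest.copy(), 1e-5, 0.01
for _ in range(200):
    base = mse(w)
    grad = np.array([(mse(w + eps * np.eye(N_W)[k]) - base) / eps for k in range(N_W)])
    w -= lr * grad

print(f"MSE after swarm search: {mse(gbest):.3e}, after local refinement: {mse(w):.3e}")

The point mirrored from the abstract is the division of labour: the swarm handles global exploration of the weight space enlarged by the delay lines, while a derivative-based method finishes the job from the best point the swarm found.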


Metadata
Title
Swarm intelligence based approach for efficient training of regressive neural networks
Publication date
23.11.2019
Published in
Neural Computing and Applications / Issue 14/2020
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-019-04606-x
