Published in: Cognitive Computation 3/2017

04-05-2017

Reservoir Computing with Both Neuronal Intrinsic Plasticity and Multi-Clustered Structure

Authors: Fangzheng Xue, Qian Li, Hongjun Zhou, Xiumin Li

Abstract

In echo state networks, both the reservoir states and the network structure are essential to the performance of reservoir computing. In neuroscience, it has been confirmed that a single neuron can adaptively change its intrinsic excitability to fit varying synaptic input; this mechanism is known in the literature as intrinsic plasticity (IP). This adaptive adjustment of the neuronal response to external inputs is believed to maximize input-output mutual information. Meanwhile, many neurophysiological experiments strongly support the existence of a multi-clustered structure with small-world-like properties in the brain. It is therefore advisable to consider both intrinsic plasticity and a multi-clustered reservoir structure, rather than a random network with a non-adaptive reservoir response. In this paper, reservoir models with neuronal intrinsic plasticity and multi-clustered structure are investigated. The effects of two types of IP rule on several computational tasks are examined in detail by combining neuronal IP with multi-clustered reservoir structures. The first is Triesch's IP rule, which drives the output activities of neurons toward exponential distributions; the second is Li's IP rule, which produces a Gaussian distribution of neuronal firing. Results show that both the multi-clustered structures and the IP rules improve the computational accuracy of reservoir computing. Without the IP rules, however, the performance gain from multi-clustered reservoirs alone is minor. Both IP rules contribute to improved computational performance, with Li's rule proving more advantageous than Triesch's. The results indicate that combining multi-clustered reservoir structures with IP learning increases the dynamic diversity of reservoir states, especially under IP learning.
The adaptive tuning of reservoir states by IP improves the dynamic complexity of neuronal activity, which in turn helps the training of the output weights. This biologically inspired reservoir model may offer insights for the optimization of reservoir computing.
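The reservoir-computing setup the abstract builds on can be sketched as a minimal echo state network: a fixed random recurrent reservoir whose states are collected while an input sequence is fed in, and a linear readout trained by ridge regression. All sizes, scalings, and the sine-prediction task below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- chosen for illustration only.
n_in, n_res = 1, 100

# Fixed random input and reservoir weights; the spectral radius is
# scaled below 1 so the echo state property is likely to hold.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Collect reservoir states x(t+1) = tanh(W_in u(t) + W x(t))."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(500)
u_seq = np.sin(0.1 * t)
y_target = np.sin(0.1 * (t + 1))

X = run_reservoir(u_seq)
washout = 100  # discard the initial transient

# Only the output weights are trained, here by ridge regression.
reg = 1e-6
W_out = np.linalg.solve(X[washout:].T @ X[washout:] + reg * np.eye(n_res),
                        X[washout:].T @ y_target[washout:])

y_pred = X[washout:] @ W_out
mse = float(np.mean((y_pred - y_target[washout:]) ** 2))
```

The key design point, which the paper's optimizations target, is that the reservoir itself is never trained: its states (and hence their dynamic diversity) are shaped only by the fixed connectivity structure and, in the paper's models, by IP adaptation.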
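Triesch's IP rule mentioned above can be sketched for a single Fermi (logistic) neuron: a gradient rule that adapts the gain and bias of the activation function so that the output distribution approaches an exponential with a fixed mean, maximizing output entropy at a given mean firing rate. The update equations follow Triesch (2005); the learning rate, target mean, and Gaussian input statistics are illustrative assumptions. Li's rule, which instead targets a Gaussian output distribution, has a different form not shown here.

```python
import numpy as np

def triesch_ip_step(x, a, b, eta=0.001, mu=0.2):
    """One online IP update for a Fermi neuron y = 1/(1 + exp(-(a*x + b))).

    Gradient rule after Triesch (2005): gain a and bias b are adapted so
    that the output distribution approaches an exponential with mean mu.
    """
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))
    delta_b = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu)
    delta_a = eta / a + delta_b * x
    return a + delta_a, b + delta_b, y

# Drive a single neuron with Gaussian input and let (a, b) adapt online.
rng = np.random.default_rng(1)
a, b = 1.0, 0.0
outputs = []
for x in rng.normal(0.0, 1.0, 20000):
    a, b, y = triesch_ip_step(x, a, b)
    outputs.append(y)

# After adaptation the mean firing rate settles near the target mu.
mean_firing = float(np.mean(outputs[-5000:]))
```

Applied to every reservoir neuron during a pre-training phase, such local updates adjust each neuron's excitability to its actual input statistics, which is the mechanism the paper combines with multi-clustered connectivity.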


Metadata
Title
Reservoir Computing with Both Neuronal Intrinsic Plasticity and Multi-Clustered Structure
Authors
Fangzheng Xue
Qian Li
Hongjun Zhou
Xiumin Li
Publication date
04-05-2017
Publisher
Springer US
Published in
Cognitive Computation / Issue 3/2017
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-017-9467-3
