
2019 | Original Paper | Book Chapter

11. Hierarchical Temporal Representation in Linear Reservoir Computing

Authors: Claudio Gallicchio, Alessio Micheli, Luca Pedrelli

Published in: Neural Advances in Processing Nonlinear Dynamic Signals

Publisher: Springer International Publishing


Abstract

Recent studies on deep Reservoir Computing (RC) have highlighted the role of layering in deep recurrent neural networks (RNNs). In this paper, the use of linear recurrent units allows us to provide further evidence of the intrinsic hierarchical temporal representation in deep RNNs, through frequency analysis applied to the state signals. The potential of our approach is assessed on the class of Multiple Superimposed Oscillator tasks. Furthermore, our investigation offers useful insights that open a discussion on the main aspects characterizing the deep learning framework in the temporal domain.
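
As an illustration of the kind of analysis described in the abstract, the following sketch (not the authors' code; layer sizes, spectral radius, input scaling, and MSO frequencies are illustrative assumptions) builds a small stack of untrained linear recurrent layers, drives it with a Multiple Superimposed Oscillator-style input, and computes the magnitude spectrum of the state signals in each layer via the FFT.

```python
# Minimal sketch: a deep reservoir with *linear* recurrent units driven by an
# MSO-style input, followed by FFT-based frequency analysis of the layer states.
# All numeric settings below are illustrative assumptions, not values from the chapter.

import numpy as np

rng = np.random.default_rng(0)

# --- MSO-style input: a sum of sinusoids with different frequencies ---
T = 2000
freqs = [0.2, 0.311, 0.42, 0.51]          # assumed angular frequencies (rad/step)
u = sum(np.sin(f * np.arange(T)) for f in freqs)

# --- Deep linear reservoir: a stack of untrained recurrent layers ---
n_layers, n_units = 3, 100
rho, w_in_scale = 0.9, 1.0                 # assumed spectral radius / input scaling

def random_reservoir(n, rho):
    """Random recurrent matrix rescaled to a target spectral radius."""
    W = rng.uniform(-1, 1, size=(n, n))
    return W * (rho / np.max(np.abs(np.linalg.eigvals(W))))

W_hat = [random_reservoir(n_units, rho) for _ in range(n_layers)]
W_in = [rng.uniform(-w_in_scale, w_in_scale,
                    size=(n_units, 1 if l == 0 else n_units))
        for l in range(n_layers)]

# Run the stack: each layer is linear, x_l(t) = W_in_l v_l(t) + W_hat_l x_l(t-1),
# where v_l(t) is the external input for the first layer and the previous
# layer's state for the others.
states = np.zeros((n_layers, T, n_units))
x = [np.zeros(n_units) for _ in range(n_layers)]
for t in range(T):
    v = np.array([u[t]])
    for l in range(n_layers):
        x[l] = W_in[l] @ v + W_hat[l] @ x[l]
        states[l, t] = x[l]
        v = x[l]

# --- Frequency analysis: average magnitude spectrum of the state signals per layer ---
washout = 200
for l in range(n_layers):
    spectrum = np.abs(np.fft.rfft(states[l, washout:], axis=0)).mean(axis=1)
    peak = np.argmax(spectrum[1:]) + 1     # skip the DC component
    print(f"layer {l + 1}: dominant frequency bin {peak}")
```

The per-layer spectra are meant to make visible how the frequency content of the state signals changes along the stack, which is the kind of hierarchical temporal differentiation the chapter investigates.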


Metadata
Title
Hierarchical Temporal Representation in Linear Reservoir Computing
Authors
Claudio Gallicchio
Alessio Micheli
Luca Pedrelli
Copyright year
2019
DOI
https://doi.org/10.1007/978-3-319-95098-3_11