
2016 | OriginalPaper | Book Chapter

Learning Sequential Data with the Help of Linear Systems

Written by: Luca Pasa, Alessandro Sperduti

Published in: Artificial Neural Networks in Pattern Recognition

Publisher: Springer International Publishing


Abstract

The aim of this paper is to show that linear dynamical systems can be quite useful in sequence learning tasks. Depending on the complexity of the problem at hand, linear dynamical systems may either directly provide a good solution at a reduced computational cost, or indirectly provide support at a pre-training stage for nonlinear models. We present and discuss several approaches, both linear and nonlinear, in which linear dynamical systems play an important role. These approaches are empirically assessed on a prediction task over two nontrivial sequence datasets. Experimental results show that linear dynamical systems can either directly provide a satisfactory solution or be crucial for the success of more sophisticated nonlinear approaches.
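For concreteness, the sketch below (NumPy, not the authors' code) illustrates the kind of model the abstract refers to: a stable linear dynamical system x_t = A x_{t-1} + B u_t whose linear readout C is fit in closed form for next-step prediction. The state dimension, ridge strength, and synthetic data are illustrative assumptions, not the setup used in the paper.

```python
# Minimal sketch: a linear dynamical system
#     x_t = A x_{t-1} + B u_t    (linear state update)
#     y_t = C x_t                (next-step prediction)
# with only the readout C learned, via closed-form ridge regression.
import numpy as np

rng = np.random.default_rng(0)

# Toy input sequence: T steps of d-dimensional observations (illustrative data).
T, d, state_dim = 200, 8, 16
u = rng.standard_normal((T, d))

# Fixed random linear dynamics, rescaled so the spectral radius is below 1
# (keeps the linear state stable over long sequences).
A = rng.standard_normal((state_dim, state_dim))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
B = rng.standard_normal((state_dim, d))

# Run the linear recurrence to encode the sequence history into states.
x = np.zeros((T, state_dim))
for t in range(1, T):
    x[t] = A @ x[t - 1] + B @ u[t]

# Learn the readout C that maps the state at time t to the observation at t+1.
# Ridge regression has a closed-form solution, so this part is computationally cheap.
lam = 1e-2
X, Y = x[:-1], u[1:]
C = np.linalg.solve(X.T @ X + lam * np.eye(state_dim), X.T @ Y).T

pred = X @ C.T
print("next-step MSE:", float(np.mean((pred - Y) ** 2)))
```

The same linear machinery can also serve as an initialization (pre-training) for a nonlinear recurrent model, which is the second use the abstract mentions.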


Metadata
Title
Learning Sequential Data with the Help of Linear Systems
Written by
Luca Pasa
Alessandro Sperduti
Copyright Year
2016
DOI
https://doi.org/10.1007/978-3-319-46182-3_1