1999 | Original Paper | Book Chapter

Continual Prediction using LSTM with Forget Gates

Authors: Felix A. Gers, Jürgen Schmidhuber, Fred Cummins

Published in: Neural Nets WIRN Vietri-99

Publisher: Springer London


Long Short-Term Memory (LSTM [1]) can solve many tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams without explicitly marked sequence ends. Without resets, the internal state values may grow indefinitely and eventually cause the network to break down. Our remedy is an adaptive "forget gate" that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review an illustrative benchmark problem on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve a continual version of that problem. LSTM with forget gates, however, solves it easily and elegantly.
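The state-growth problem the abstract describes, and the forget-gate remedy, can be sketched with a single toy cell. This is an illustrative sketch only, not the paper's trained network: the gate weight `w` and the constant input stream are hypothetical choices made to expose the effect. A standard LSTM cell accumulates its internal state as c_t = c_{t-1} + i_t·g_t, so on a continual stream with no resets the state drifts without bound; a forget gate f_t rescales the old state, c_t = f_t·c_{t-1} + i_t·g_t, which keeps it bounded.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run_cell(stream, use_forget_gate, w=1.0):
    """Feed a continual input stream through one toy LSTM cell.

    Gates are plain sigmoids of the input with a single weight w --
    hypothetical parameters, chosen only to illustrate state growth,
    not trained as in the paper."""
    c = 0.0
    for x in stream:
        i = sigmoid(w * x)   # input gate
        g = math.tanh(w * x) # candidate cell input
        # Standard LSTM is the special case f = 1 (never forget).
        f = sigmoid(w * x) if use_forget_gate else 1.0
        c = f * c + i * g    # cell state update
    return c

# A long continual stream with no marked sequence ends.
stream = [1.0] * 1000

c_standard = run_cell(stream, use_forget_gate=False)
c_forget = run_cell(stream, use_forget_gate=True)
print(f"standard LSTM state after 1000 steps: {c_standard:.1f}")
print(f"with forget gate:                     {c_forget:.3f}")
```

With a constant positive input, the standard cell's state grows linearly in the number of steps, while the forget-gated cell converges to the fixed point c* = i·g / (1 − f) and stays there, which is the "release of internal resources" the abstract refers to.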

DOI: https://doi.org/10.1007/978-1-4471-0877-1_10
