2012 | Original Paper | Book Chapter
Elimination of a Catastrophic Destruction of a Memory in the Hopfield Model
Authors: Iakov Karandashev, Boris Kryzhanovsky, Leonid Litinskii
Published in: Engineering Applications of Neural Networks
Publisher: Springer Berlin Heidelberg
In the standard Hopfield model, a catastrophic destruction of the memory takes place when the network is overloaded (so-called catastrophic forgetting). We eliminate catastrophic forgetting by assigning different weights to the input patterns. As weights one can use the frequencies with which the patterns appear during the learning process. We show that only patterns whose weights exceed some critical weight are recognized. The case where the weights are the terms of a geometric series is studied in detail. The theoretical results are in good agreement with computer simulations.
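The weighted learning rule described in the abstract can be sketched in a few lines: each pattern contributes to the Hebbian connection matrix in proportion to its weight, and here the weights are taken as terms of a geometric series. This is a minimal illustration, not the authors' implementation; the network size, the number of patterns, and the common ratio `q` are assumed values chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200  # number of neurons (assumed for illustration)
M = 5    # number of stored patterns (assumed)
q = 0.5  # common ratio of the geometric series of weights (assumed)

# random bipolar (+/-1) patterns
patterns = rng.choice([-1, 1], size=(M, N))

# pattern weights as terms of a geometric series: r_m = q**m
weights = q ** np.arange(M)

# weighted Hebbian connection matrix J_ij = sum_m r_m * xi_m_i * xi_m_j,
# with zero self-connections
J = np.einsum('m,mi,mj->ij', weights, patterns, patterns)
np.fill_diagonal(J, 0.0)

def recall(state, n_iter=20):
    """Synchronous sign-update dynamics, run until a fixed point or n_iter."""
    s = state.copy()
    for _ in range(n_iter):
        new = np.sign(J @ s)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, s):
            break
        s = new
    return s

# With these parameters the heaviest-weight pattern is stable under the dynamics
print(np.array_equal(recall(patterns[0]), patterns[0]))
```

Patterns whose weights fall below the critical value would fail the analogous stability check, which is the recognition threshold the paper analyzes.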