2014 | Original Paper | Book Chapter
Self-generated Off-line Memory Reprocessing Strongly Improves Generalization in a Hierarchical Recurrent Neural Network
Author: Jenia Jitsev
Published in: Artificial Neural Networks and Machine Learning – ICANN 2014
Publisher: Springer International Publishing
Strong experimental evidence suggests that cortical memory traces are consolidated during off-line memory reprocessing, which takes place in off-line states such as sleep or waking rest. It remains unclear which plasticity mechanisms are involved in this process and what changes are induced in the network during the off-line regime. Here, we examine a hierarchical recurrent neural network that performs unsupervised learning on natural face images of different persons. The proposed network is able to self-generate memory replay while decoupled from external stimuli. Remarkably, recognition performance is strongly boosted after this off-line regime, specifically for novel face views that were not shown during the initial learning. This effect is independent of synapse-specific plasticity, relying entirely on homeostatic regulation of intrinsic excitability. Comparing a purely feed-forward configuration with the full recurrent version reveals a substantially stronger post-off-line boost in recognition performance for the fully recurrent architecture.
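The abstract attributes the off-line improvement to homeostatic regulation of intrinsic excitability rather than synaptic change. The following is a minimal, hypothetical sketch of that general mechanism (not the paper's actual model or parameters): each unit adapts its own firing threshold so that its average activity drifts toward a target rate, while synaptic weights are left untouched. The function name, learning rate, and target rate are illustrative assumptions.

```python
import numpy as np

def homeostatic_threshold_update(thresholds, activities, target_rate, eta=0.01):
    # One homeostatic step: raise the threshold of units firing above the
    # target rate and lower it for units firing below it. Only intrinsic
    # excitability (the threshold) changes; no synaptic weight is modified.
    return thresholds + eta * (activities - target_rate)

# Toy demonstration with random input drive (stand-in for replay activity).
rng = np.random.default_rng(0)
n = 50                      # number of units (illustrative)
thresholds = np.zeros(n)
target = 0.1                # desired mean activity per unit (illustrative)

for _ in range(2000):
    drive = rng.normal(size=n)                  # stand-in input per unit
    activities = (drive > thresholds).astype(float)
    thresholds = homeostatic_threshold_update(thresholds, activities, target)

# After adaptation, each unit's firing rate on fresh input sits near the target.
mean_rate = (rng.normal(size=(1000, n)) > thresholds).mean()
```

Under this rule the thresholds settle where each unit crosses its threshold on roughly a `target` fraction of inputs, illustrating how an activity set point can be maintained without any synapse-specific plasticity.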