2006 | OriginalPaper | Buchkapitel
Improved Storage Capacity of Hebbian Learning Attractor Neural Network with Bump Formations
verfasst von : Kostadin Koroutchev, Elka Korutcheva
Erschienen in: Artificial Neural Networks – ICANN 2006
Verlag: Springer Berlin Heidelberg
Recently, bump formations in attractor neural networks with distance-dependent connectivity have become of increasing interest in biological and computational neuroscience. Although distance-dependent connectivity is common in biological networks, a common shortcoming of these networks is the sharp drop in the number of patterns p that can be remembered when the activity changes from global to bump-like, which effectively makes these networks inefficient.
In this paper we present a bump-based recursive network specially designed to increase its capacity, which is comparable with that of a randomly connected sparse network. To this aim, we have tested a selection of 700 natural images on a network with N = 64K neurons and connectivity C per neuron. We have shown that the capacity of the network is of order C, in accordance with the capacity of a highly diluted network. Preserving the number of connections per neuron, a non-trivial dependence of the capacity on the connectivity radius has been observed. Our results show that the decrease in capacity of the bumpy network can be avoided.