Abstract
Associative memory, i.e. the storage and recall of information by association with other information, may be the simplest application of “collective” computation on a neural network. An information storage device is called an associative memory if it permits the recall of information on the basis of partial knowledge of its content, without knowing its storage location. One also speaks of content-addressable memory.
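A minimal sketch of such content-addressable recall (an illustration in Python with NumPy; the network size, number of patterns, and noise level are arbitrary choices, not values from the chapter): two random ±1 patterns are stored in Hebbian couplings, and the network dynamics completes a corrupted cue back to the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                      # number of neurons
patterns = rng.choice([-1, 1], size=(2, N))  # two random +/-1 patterns

# Hebbian couplings w_ij = (1/N) sum_mu xi_i^mu xi_j^mu, without self-coupling
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

# Corrupt the first pattern: flip 15% of the neurons
cue = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
cue[flip] *= -1

# Asynchronous updates s_i <- sign(sum_j w_ij s_j)
s = cue.copy()
for _ in range(5):                           # a few sweeps suffice here
    for i in rng.permutation(N):
        s[i] = 1 if W[i] @ s >= 0 else -1

print("overlap with stored pattern:", (s @ patterns[0]) / N)  # close to 1.0
```

Although the cue disagrees with the stored pattern on 15% of the neurons, the dynamics relaxes to the stored pattern itself: recall proceeds from partial content, not from an address.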
Notes
The associative-memory model was studied by several researchers in the early 1970s [An72, Na72, Ko72, Am72]. It has enjoyed great popularity since Hopfield [Ho82] drew the analogy to physical spin systems.
If we had excluded the self-coupling of neuron i, the sum would run over only (N−1) terms, and the appropriate normalization factor in (3.6) would be 1/(N−1) instead of 1/N.
This is a consequence of the central-limit theorem: the sum of a large number, N, of independent random variables obeys a Gaussian distribution centered at N times the mean value and having a width (standard deviation) which is √N times the width of the original probability distribution. For our special case this can be deduced as follows (see, e.g., [Wi86a]). Let $\xi_i$, $i = 1, 2, \ldots$, be discrete random variables taking the values $\xi_i = \pm 1$ with equal probability. The task is to find an expression for the probability distribution $p(x, N)$ of the sum $x = \sum_{i=1}^{N} \xi_i$. Clearly we have
$$p(x, 1) = \tfrac{1}{2}\left[\delta(x-1) + \delta(x+1)\right].$$
Furthermore, if we know $p(x, N-1)$, we obtain $p(x, N)$ by the convolution
$$p(x, N) = \int dx'\, p(x - x', 1)\, p(x', N-1) = \tfrac{1}{2}\left[p(x-1, N-1) + p(x+1, N-1)\right].$$
An explicit expression is obtained by means of the Fourier transform
$$\tilde{p}(k, N) = \int dx\, e^{ikx}\, p(x, N), \qquad \tilde{p}(k, 1) = \cos k.$$
Since the Fourier transformation converts a convolution of two functions into the product of the Fourier transforms, i.e. $\tilde{p}(k, N) = \tilde{p}(k, N-1)\,\tilde{p}(k, 1)$, we find that
$$\tilde{p}(k, N) = (\cos k)^N.$$
An approximate expression for large N can be found by noting that this limit corresponds to small values of k, where we can expand the cosine function:
$$(\cos k)^N \approx \left(1 - \tfrac{k^2}{2}\right)^N \approx e^{-Nk^2/2}.$$
Finally,
$$p(x, N) = \frac{1}{2\pi}\int dk\, e^{-ikx}\, \tilde{p}(k, N) \approx \frac{1}{2\pi}\int_{-\infty}^{+\infty} dk\, e^{-ikx - Nk^2/2} = \frac{1}{\sqrt{2\pi N}}\, e^{-x^2/2N},$$
showing that the distribution has the width $\Delta x \propto \sqrt{N}$.
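A quick numerical check of this √N scaling, as a sketch in Python (the sample counts are arbitrary): many independent sums of N variables $\xi_i = \pm 1$ are drawn, and their empirical width is compared with √N.

```python
import numpy as np

rng = np.random.default_rng(1)
for N in (10, 100, 1000):
    # 100,000 independent sums x = sum_{i=1}^{N} xi_i with xi_i = +/-1
    sums = rng.choice([-1, 1], size=(100_000, N)).sum(axis=1)
    print(f"N={N:5d}  empirical width={sums.std():8.2f}  sqrt(N)={np.sqrt(N):8.2f}")
```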
There are at most N different orthogonal patterns! (Mutually orthogonal patterns are linearly independent vectors in an N-dimensional space.)
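The bound is saturated when N is a power of two, where the rows of a Hadamard matrix form exactly N mutually orthogonal ±1 patterns; a minimal sketch (SciPy's hadamard is used here merely as a convenience):

```python
import numpy as np
from scipy.linalg import hadamard

N = 8
patterns = hadamard(N)        # N rows, each a +/-1 pattern of length N
# Pairwise scalar products: N on the diagonal, 0 everywhere else
print(patterns @ patterns.T)  # = N * identity matrix
```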
It is remarkable that Hebb’s rule (3.12) always leads to symmetric synaptic couplings, which is no longer true for more general learning rules.
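The symmetry follows because Hebb's rule builds the coupling matrix as a sum of outer products $\xi^\mu (\xi^\mu)^T$, each of which is symmetric. A minimal numerical check (the pattern count and network size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
xi = rng.choice([-1, 1], size=(5, 40))  # five stored patterns, N = 40 neurons

# Hebb's rule: w_ij = (1/N) sum_mu xi_i^mu xi_j^mu -- a sum of symmetric
# outer products, so W = W^T holds automatically
W = (xi.T @ xi) / xi.shape[1]
print(np.allclose(W, W.T))              # True
```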
This is the Lyapunov function of the system.
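For symmetric couplings without self-coupling, the energy $E(s) = -\tfrac{1}{2}\sum_{ij} w_{ij} s_i s_j$ never increases under asynchronous updates, which is what qualifies it as a Lyapunov function. A sketch verifying this monotonicity numerically (all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 60
xi = rng.choice([-1, 1], size=(3, N))    # three stored patterns
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0.0)                 # symmetric couplings, no self-coupling

def energy(s):
    # E(s) = -1/2 sum_ij w_ij s_i s_j
    return -0.5 * (s @ W @ s)

s = rng.choice([-1, 1], size=N)          # random initial state
E = energy(s)
for i in rng.permutation(N):             # one asynchronous sweep
    s[i] = 1 if W[i] @ s >= 0 else -1
    E_new = energy(s)
    assert E_new <= E + 1e-12            # the energy never increases
    E = E_new
print("energy after one sweep:", E)
```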
We note that it is possible to generate slow sequences of patterns without introducing an explicit temporal order via time-delayed synapses: directed transitions between stored patterns can also be induced by the action of thermal noise [Bu87b]. The operation of networks at finite temperatures will be discussed in the next chapter.
Note that the reaction of the network to an external stimulus is very fast in this model, occurring on the elementary neural time scale of a few milliseconds.
Copyright information
© 1995 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
Müller, B., Reinhardt, J., Strickland, M.T. (1995). Associative Memory. In: Neural Networks. Physics of Neural Networks. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-57760-4_3
DOI: https://doi.org/10.1007/978-3-642-57760-4_3
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-60207-1
Online ISBN: 978-3-642-57760-4
eBook Packages: Springer Book Archive