Associative Memory

Chapter in: Neural Networks

Part of the book series: Physics of Neural Networks

Abstract

Associative memory, i.e. the storage and recall of information by association with other information, may be the simplest application of “collective” computation on a neural network. An information storage device is called an associative memory if it permits the recall of information on the basis of partial knowledge of its content, without knowing its storage location. One also speaks of content-addressable memory.
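As a concrete illustration of content-addressable recall, the following Python sketch stores a few random patterns with a Hebbian rule and retrieves one of them from a corrupted probe. It is a minimal sketch under the conventions developed in this chapter (binary ±1 neurons, deterministic threshold dynamics), not code from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of neurons
P = 5    # number of stored patterns (P << N for reliable recall)

# Random binary patterns xi[mu, i] in {-1, +1}
xi = rng.choice([-1, 1], size=(P, N))

# Hebb's rule: w_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
w = (xi.T @ xi) / N
np.fill_diagonal(w, 0.0)  # exclude the self-coupling of each neuron

def recall(s, sweeps=10):
    """Deterministic asynchronous updates: align each neuron with its local field."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Corrupt 15% of the first pattern, then recall it by content alone
probe = xi[0].copy()
flipped = rng.choice(N, size=15, replace=False)
probe[flipped] *= -1

restored = recall(probe)
print("overlap with stored pattern:", restored @ xi[0] / N)  # ≈ 1.0
```

With only a few patterns stored (P ≪ N), the network relaxes from the partial or corrupted input to the complete stored pattern, which is exactly the content-addressable behaviour described above.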


Notes

  1. The associative-memory model was studied by several researchers in the early 1970s [An72, Na72, Ko72, Am72]. It has enjoyed great popularity since Hopfield [Ho82] drew the analogy to physical spin systems.


  2. If we had excluded the self-coupling of neuron i, the sum would run over only (N−1) terms, and the appropriate normalization factor in (3.6) would be 1/(N−1) instead of 1/N.


  3. This is a consequence of the central-limit theorem: the sum of a large number N of independent random variables obeys a Gaussian distribution centered at N times the mean value, whose width (standard deviation) is √N times that of the original probability distribution. For our special case this can be deduced as follows (see, e.g., [Wi86a]). Let $\xi_i$, i = 1, 2, …, be discrete random variables taking the values $\xi_i = \pm 1$ with equal probability. The task is to find an expression for the probability distribution p(x, N) of the sum $x = \sum_{i=1}^{N} \xi_i$. Clearly we have

     $$p(x, 1) = \tfrac{1}{2}\left[\delta(x-1) + \delta(x+1)\right].$$

     Furthermore, if we know p(x, N−1), we obtain p(x, N) by the convolution

     $$p(x, N) = \int dx'\, p(x', N-1)\, p(x - x', 1).$$

     An explicit expression is obtained by means of the Fourier transform

     $$\tilde{p}(k, N) = \int dx\, e^{-ikx}\, p(x, N), \qquad \tilde{p}(k, 1) = \cos k.$$

     Since the Fourier transformation converts a convolution of two functions into the product of the Fourier transforms, i.e. $\tilde{p}(k, N) = \tilde{p}(k, N-1)\,\tilde{p}(k, 1)$, we find that

     $$\tilde{p}(k, N) = (\cos k)^N.$$

     An approximate expression for large N can be found by noting that this limit corresponds to small values of k, where we can expand the cosine function: $(\cos k)^N \approx (1 - k^2/2)^N \approx e^{-Nk^2/2}$. Finally

     $$p(x, N) = \frac{1}{2\pi} \int dk\, e^{ikx}\, \tilde{p}(k, N) \approx \frac{1}{\sqrt{2\pi N}}\, e^{-x^2/2N},$$

     showing that the distribution has the width Δx ∝ √N.
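     The √N scaling is also easy to verify numerically; a quick illustrative check (a sketch added here, not part of the original note), using NumPy:

     ```python
     import numpy as np

     rng = np.random.default_rng(1)

     # Width of the sum of N random +-1 variables for growing N:
     # the standard deviation should grow like sqrt(N).
     for N in (100, 400, 1600):
         sums = rng.choice([-1, 1], size=(100_000, N)).sum(axis=1)
         print(f"N={N:5d}  std={sums.std():7.2f}  std/sqrt(N)={sums.std() / np.sqrt(N):.3f}")
     ```

     The last column stays close to 1, confirming Δx ∝ √N.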


  4. There are at most N different orthogonal patterns, since the patterns are vectors in an N-dimensional space!


  5. It is remarkable that Hebb’s rule (3.12) always leads to symmetric synaptic couplings, a property that no longer holds for more general learning rules.
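     Assuming (3.12) has the standard Hebbian form (an assumption made explicit here for illustration), the symmetry follows simply because the product of two pattern bits commutes:

     $$w_{ij} = \frac{1}{N} \sum_{\mu} \xi_i^\mu \xi_j^\mu = \frac{1}{N} \sum_{\mu} \xi_j^\mu \xi_i^\mu = w_{ji}.$$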


  6. This is the Lyapunov function of the system.
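     For reference, the standard choice of such an energy function for a network with symmetric couplings $w_{ij} = w_{ji}$ is (a standard form, quoted here without the chapter’s own equation number)

     $$E[s] = -\frac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j,$$

     which cannot increase under asynchronous updates of the neurons; the dynamics therefore terminates in a local minimum of E.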


  7. We note that it is possible to generate slow sequences of patterns without introducing an explicit temporal order through time-delayed synapses: directed transitions between stored patterns can also be induced by the action of thermal noise [Bu87b]. The operation of networks at finite temperatures will be discussed in the next chapter.


  8. Note that the reaction of the network to an external stimulus is very fast in this model, occurring on the elementary neural time scale of a few milliseconds.



Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Müller, B., Reinhardt, J., Strickland, M.T. (1995). Associative Memory. In: Neural Networks. Physics of Neural Networks. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-57760-4_3

  • DOI: https://doi.org/10.1007/978-3-642-57760-4_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-60207-1

  • Online ISBN: 978-3-642-57760-4
