
On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks

Published in: Journal of Statistical Physics

Abstract

Learning of patterns by neural networks obeying general rules of sensory transduction and of converting membrane potentials to spiking frequencies is considered. Any finite number of cells 𝒜 can sample a pattern playing on any finite number of cells ℬ without causing irrevocable sampling bias if 𝒜 = ℬ or 𝒜 ∩ ℬ = ∅. Total energy transfer from inputs of 𝒜 to outputs of ℬ depends on the entropy of the input distribution. Pattern completion on recall trials can occur without destroying perfect memory even if 𝒜 = ℬ by choosing the signal thresholds sufficiently large. The mathematical results are global limit and oscillation theorems for a class of nonlinear functional-differential systems.
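The abstract's sampling scenario can be illustrated with a minimal numerical sketch. The system below is an assumption for illustration only, in the spirit of Grossberg's "outstar" learning equations, not the paper's exact functional-differential system (it omits, e.g., signal thresholds and time lags): a sampling source 𝒜 gates weight drift toward the activity that a spatial pattern induces on border cells ℬ.

```python
import numpy as np

def train_outstar(theta, steps=5000, dt=0.01, alpha=1.0, beta=0.5):
    """Euler integration of a simplified outstar-style system (assumed form):

        dx_i/dt = -alpha * x_i + theta_i   (border-cell potentials, driven by pattern)
        dz_i/dt = beta * (x_i - z_i)       (sampling-gated weight drift; gate held at 1)

    With alpha = 1 the potentials equilibrate at x_i = theta_i, so the
    weights z_i drift toward the pattern itself.
    """
    theta = np.asarray(theta, dtype=float)
    x = np.zeros_like(theta)
    z = np.full_like(theta, 1.0 / len(theta))  # uninformative initial weights
    for _ in range(steps):
        x += dt * (-alpha * x + theta)  # potentials relax toward the pattern
        z += dt * beta * (x - z)        # weights track border-cell activity
    return z

pattern = [0.5, 0.3, 0.2]
weights = train_outstar(pattern)
print(weights)  # drifts toward the pattern's proportions
```

Run long enough, the learned weights reproduce the pattern's relative weights, a toy analogue of sampling without irrevocable bias when 𝒜 and ℬ are disjoint.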




Additional information

The preparation of this work was supported in part by the National Science Foundation (GP 9003), the Office of Naval Research (N00014-67-A-024-OQ16), and the A.P. Sloan Foundation.


Cite this article

Grossberg, S. On learning and energy-entropy dependence in recurrent and nonrecurrent signed networks. J Stat Phys 1, 319–350 (1969). https://doi.org/10.1007/BF01007484
