Elsevier

Neural Networks

Volume 5, Issue 4, July–August 1992, Pages 565-576

Original Contribution
The gamma model—A new neural model for temporal processing

https://doi.org/10.1016/S0893-6080(05)80035-8

Abstract

In this paper we develop the gamma neural model, a new neural net architecture for processing temporal patterns. Time-varying patterns are normally segmented into a sequence of static patterns that are successively presented to a neural net. In the approach presented here, segmentation is avoided: only current signal values are presented to the neural net, which adapts its own internal memory to store the past. Thus, in the gamma neural net, an adaptive short-term memory mechanism obviates a priori signal segmentation. We evaluate the relation between the gamma net and competing dynamic neural models. Interestingly, the gamma model brings many popular dynamic net architectures, such as the time-delay neural net and the concentration-in-time neural net, into a unifying framework. In fact, the gamma memory structure appears to be as general as a temporal convolution memory structure with an arbitrary time-varying weight kernel w(t). Yet the gamma model remains mathematically equivalent to the additive (Grossberg) model with constant weights. We present a back-propagation procedure to adapt the weights in a particular feedforward structure, the focused gamma net.
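The adaptive short-term memory described in the abstract can be illustrated with the discrete-time gamma memory: a cascade of identical leaky integrators whose single parameter μ trades memory depth against resolution. The sketch below is illustrative only (function and variable names are ours, not the paper's); it shows the standard recursion and how μ = 1 recovers the tap delay line of a time-delay neural net.

```python
import numpy as np

def gamma_memory(u, K, mu):
    """Discrete-time gamma memory: a cascade of K leaky integrators.

    x_0[n] = u[n]                                   (input tap)
    x_k[n] = (1 - mu) * x_k[n-1] + mu * x_{k-1}[n-1],  k = 1..K

    For mu = 1 the recursion degenerates to an ordinary tap delay
    line (the memory structure of a time-delay neural net); for
    0 < mu < 1 each tap holds a progressively smoothed, delayed
    version of the input (a gamma-kernel convolution).
    """
    N = len(u)
    x = np.zeros((K + 1, N))
    x[0] = u
    for n in range(1, N):
        for k in range(1, K + 1):
            x[k, n] = (1 - mu) * x[k, n - 1] + mu * x[k - 1, n - 1]
    return x  # x[k] is the k-th memory tap as a function of time

# With mu = 1, tap k is simply the input delayed by k steps:
u = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # unit impulse
taps = gamma_memory(u, K=2, mu=1.0)
```

In the focused gamma net, these taps (rather than a pre-segmented window of the signal) feed a static feedforward net, and μ itself can be adapted along with the weights.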

References (27)

  • Bodenhausen, U., et al.

    Learning the architecture of neural networks for speech recognition

  • Braun, M.

    Ordinary differential equations and their applications

    (1983)
  • de Vries, B., et al.

    A theory for neural nets with time delays

  • Elman, J. L.

    Finding structure in time

    Cognitive Science

    (1990)
  • Fargue, D. M.

    Réductibilité des systèmes héréditaires à des systèmes dynamiques (régis par des équations différentielles ou aux dérivées partielles) [Reducibility of hereditary systems to dynamical systems (governed by differential or partial differential equations)]

    C. R. Acad. Sci. Paris

    (1973)
  • Fukushima, K.

    Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition by shift in position

    Biological Cybernetics

    (1980)
  • Gori, M., et al.

    BPS: A learning algorithm for capturing the dynamic nature of speech

  • Grossberg, S.

    Studies of mind and brain

    (1982)
  • Jordan, M. I.

    Attractor dynamics and parallelism in a connectionist sequential machine

  • Hertz, J., et al.

    Introduction to the theory of neural computation

    (1991)
  • Lang, K., et al.

    A time-delay neural network architecture for isolated word recognition

    Neural Networks

    (1990)
  • Miller, R.

    Representation of brief temporal patterns, Hebbian synapses, and the left-hemisphere dominance for phoneme recognition

    Psychobiology

    (1987)
  • Mozer, M. C.

    A focused back propagation algorithm for temporal pattern recognition

    Complex Systems

    (1989)
    * Current address: David Sarnoff Research Center, CN5300, Princeton, NJ 08543-5300.
