Coupled Neural Networks

Chapter in: Neural Networks

Part of the book series: Physics of Neural Networks

Abstract

Multilayered feed-forward networks (perceptrons) are special cases of the general McCulloch-Pitts neural network with arbitrarily interconnected neurons. On the other hand, any general “recurrent” neural network can be represented by a feed-forward perceptron, albeit one with possibly very many layers. The reason for this strange equivalence is that the temporal evolution (3.5) of an arbitrary network constructed from binary neurons is necessarily periodic. This follows immediately from the observation that the N neurons can assume only 2^N configurations altogether, so some state of the network must recur after at most 2^N steps. Since only the present state of the network enters on the right-hand side of the evolution law (14.1), the subsequent evolution proceeds strictly periodically from that moment on. If one considers the neural network at a certain moment t = n as the nth layer of a perceptron (with all layers identical!), the temporal evolution law can be viewed as the law governing the flow of information from one layer to the next. It then suffices to take into account only a finite number of such layers, just as many as there are time steps leading up to the first repetition of a network configuration.
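The periodicity argument is easy to check numerically. The sketch below is a minimal illustration, not taken from the chapter: the random couplings w and the synchronous sign-threshold update are assumptions consistent with the McCulloch-Pitts form. It iterates a small binary network until a configuration recurs, which must happen within 2^N steps.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8                                   # number of binary neurons
w = rng.normal(size=(N, N))             # arbitrary synaptic couplings (assumption)

def step(s):
    """Synchronous McCulloch-Pitts update: s_i -> sign(sum_j w_ij s_j)."""
    return np.where(w @ s >= 0, 1, -1)

# Iterate from a random start configuration, recording each state seen.
# With only 2**N possible configurations, some state must recur within
# 2**N steps; from that moment on the evolution is strictly periodic.
s = rng.choice([-1, 1], size=N)
seen = {tuple(s): 0}
for t in range(1, 2**N + 1):
    s = step(s)
    key = tuple(s)
    if key in seen:
        print(f"state from t={seen[key]} recurs at t={t}; period {t - seen[key]}")
        break
    seen[key] = t
```

Read as a perceptron, the state at time t is the t-th layer and every layer-to-layer connection uses the same weight matrix w, so finitely many layers (up to the first recurrence) capture the full dynamics.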


Notes

  1. It is common to treat only some of the neurons as receptor neurons; the external inputs I_i are then nonzero only for those neurons.


  2. If the fixed-point equation (14.3) has more than one solution, which fixed point is reached may well depend on the starting configuration. We do not consider this case of multistability further here, and concentrate on a single task to be learned by the network.


  3. The same complication occurs if lateral synaptic connections between neurons within the same layer of a perceptron are allowed.


  4. The existence of a stationary state of the assistant network is guaranteed if the original network has a fixed point, since its dynamics corresponds to that of the linearized original network in the vicinity of its fixed point, but run backwards in time. The matrix w_ik of the synaptic connections of the assistant network is the transpose of the synaptic matrix w_ki of the original neural net, which means that the directions of all synapses have been reversed (a schematic check follows these notes).


  5. A similar differential equation was found to describe an electronic network of coupled nonlinear circuits.

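As a schematic check of note 4 (a sketch under assumptions, not the book's construction: the linearization is taken to act as the plain coupling matrix w, with arbitrary contractive random values), the following compares the spectra of w and its transpose. Since they coincide, a stable fixed point of the original network carries over to the assistant dynamics, which therefore settles to a stationary state.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 6
# Contractive random couplings, so the linearized map is stable (assumption).
w = 0.4 * rng.normal(size=(N, N)) / np.sqrt(N)

# Near its fixed point the original network acts, to linear order, like
# the map ds -> w @ ds. The assistant network reverses all synapse
# directions, i.e. it uses the transposed matrix w.T.
eig_orig = np.sort_complex(np.linalg.eigvals(w))
eig_assist = np.sort_complex(np.linalg.eigvals(w.T))

# A matrix and its transpose share the same characteristic polynomial,
# hence the same eigenvalue spectrum.
print("spectra agree:", np.allclose(eig_orig, eig_assist))
print("stable fixed point:", np.all(np.abs(eig_orig) < 1))
```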



Copyright information

© 1995 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Müller, B., Reinhardt, J., Strickland, M.T. (1995). Coupled Neural Networks. In: Neural Networks. Physics of Neural Networks. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-57760-4_14


  • DOI: https://doi.org/10.1007/978-3-642-57760-4_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-60207-1

  • Online ISBN: 978-3-642-57760-4

