
1995 | Original Paper | Book Chapter

Generic Back-Propagation in Arbitrary FeedForward Neural Networks

Authors: Cédric Gégout, Bernard Girau, Fabrice Rossi

Published in: Artificial Neural Nets and Genetic Algorithms

Publisher: Springer Vienna


In this paper, we describe a general mathematical model for feedforward neural networks. The final form of the network is a vector-valued function f of two variables, x (the input of the network) and w (the weight vector). We show that the differential of f can be computed either with an extended back-propagation algorithm or with a direct method. By evaluating the time each method needs to compute the differential, we show how to choose the better one. We also introduce input sharing and an output function, which allow us to implement a multilayer perceptron efficiently with our model.
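The comparison sketched in the abstract can be illustrated with a toy example. The code below is a minimal sketch, not the paper's formulation: it views a one-hidden-layer network as a function f(x, w) of an input x and a flat weight vector w, computes the Jacobian of f with respect to w by hand-written reverse-mode back-propagation, and checks it against a "direct" stand-in (here, simple forward differences, one extra forward pass per weight; the paper's actual direct method is an analytic forward computation). All names and network shapes are illustrative assumptions.

```python
import numpy as np

N_IN, N_HID, N_OUT = 3, 4, 2  # illustrative network sizes

def unpack(w):
    """Split the flat weight vector w into the two weight matrices."""
    w1 = w[: N_IN * N_HID].reshape(N_HID, N_IN)
    w2 = w[N_IN * N_HID:].reshape(N_OUT, N_HID)
    return w1, w2

def f(x, w):
    """Network output: f(x, w) = W2 @ tanh(W1 @ x)."""
    w1, w2 = unpack(w)
    return w2 @ np.tanh(w1 @ x)

def backprop_jacobian(x, w):
    """Jacobian of f w.r.t. w via reverse-mode back-propagation."""
    w1, w2 = unpack(w)
    h = np.tanh(w1 @ x)           # hidden activations
    # d f_i / d w2[i, j] = h[j]; each output depends on one row of w2.
    J2 = np.zeros((N_OUT, N_OUT * N_HID))
    for i in range(N_OUT):
        J2[i, i * N_HID:(i + 1) * N_HID] = h
    # d f_i / d w1[j, k] = w2[i, j] * tanh'(a_j) * x[k] (chain rule).
    d = 1.0 - h ** 2              # tanh'(a) expressed via tanh(a)
    J1 = np.einsum('ij,j,k->ijk', w2, d, x).reshape(N_OUT, N_HID * N_IN)
    return np.hstack([J1, J2])

def direct_jacobian(x, w, eps=1e-6):
    """Direct-method stand-in: forward differences, one pass per weight."""
    y0 = f(x, w)
    J = np.zeros((y0.size, w.size))
    for k in range(w.size):
        wp = w.copy()
        wp[k] += eps
        J[:, k] = (f(x, wp) - y0) / eps
    return J

rng = np.random.default_rng(0)
x = rng.standard_normal(N_IN)
w = rng.standard_normal(N_IN * N_HID + N_HID * N_OUT)
print(np.allclose(backprop_jacobian(x, w), direct_jacobian(x, w), atol=1e-4))
```

The cost trade-off the abstract alludes to is visible here: back-propagation produces the whole Jacobian in one backward sweep, while the direct loop needs one additional forward pass per weight, so the better choice depends on the numbers of inputs, weights, and outputs.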

Metadata
Title
Generic Back-Propagation in Arbitrary FeedForward Neural Networks
Authors
Cédric Gégout
Bernard Girau
Fabrice Rossi
Copyright year
1995
Publisher
Springer Vienna
DOI
https://doi.org/10.1007/978-3-7091-7535-4_45
