1995 | Original Paper | Book Chapter
Generic Back-Propagation in Arbitrary FeedForward Neural Networks
Authors: Cédric Gégout, Bernard Girau, Fabrice Rossi
Published in: Artificial Neural Nets and Genetic Algorithms
Publisher: Springer Vienna
Included in: Professional Book Archive
In this paper, we describe a general mathematical model for feedforward neural networks. The final form of the network is a vectorial function f of two variables, x (the input of the network) and w (the weight vector). We show that the differential of f can be computed either with an extended back-propagation algorithm or with a direct method. By evaluating the time each method needs to compute the differential, we show how to choose the better one. We also introduce input sharing and output functions, which allow us to implement a multilayer perceptron efficiently with our model.
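The trade-off between the two methods can be illustrated, loosely, with modern automatic differentiation: reverse mode plays the role of the extended back-propagation algorithm, while forward mode corresponds to a direct computation of the differential. Below is a minimal sketch in JAX, not the authors' implementation; the toy network, its shapes, and all names are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy network: f(w, x) is a vectorial function of the weights w
# (here a tuple of two matrices) and the input x, as in the paper's model.
def f(w, x):
    w1, w2 = w
    h = jnp.tanh(w1 @ x)      # hidden layer
    return jnp.tanh(w2 @ h)   # output layer

k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
w = (jax.random.normal(k1, (4, 3)), jax.random.normal(k2, (2, 4)))
x = jax.random.normal(k3, (3,))

# Reverse mode (back-propagation style): one backward sweep per output
# coordinate, so its cost scales with the output dimension.
J_rev = jax.jacrev(f)(w, x)

# Forward mode (direct style): one forward sweep per weight coordinate,
# so its cost scales with the size of the weight vector.
J_fwd = jax.jacfwd(f)(w, x)
```

Which method is faster thus depends on the relative sizes of the weight vector and the output, which is the kind of timing comparison the paper carries out.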