2004 | Original Paper | Book Chapter
Adaptive Transfer Functions in Radial Basis Function (RBF) Networks
Author: Günther A. Hoffmann
Published in: Computational Science - ICCS 2004
Publisher: Springer Berlin Heidelberg
Included in: Professional Book Archive
The quality of Radial Basis Function (RBF) networks and other nonlinear learning networks, such as Multi-Layer Perceptrons (MLP), depends significantly on choices of architecture, learning algorithm, initialisation heuristics and regularization techniques. Little attention has been given to the effect of mixture transfer functions in RBF networks on model quality and on the efficiency of parameter optimisation. We propose Universal Basis Functions (UBF) with flexible activation functions that are parameterised to change their shape smoothly from one functional form to another. In this way they can cover bounded and unbounded subspaces, depending on the data distribution. We define UBF and apply them to a number of classification and function approximation tasks. We find that the UBF approach outperforms traditional RBF on the Hermite data set, a noisy Fourier series and a non-f-separable classification problem; however, it does not improve statistically significantly on the Mackey-Glass chaotic time series. The paper concludes with comments and issues for future research.
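The core idea of a flexible activation that morphs between a bounded (local) and an unbounded (half-space) response can be illustrated with a minimal sketch. This is a hypothetical parameterisation for illustration only, not the paper's exact UBF definition: a mixing parameter `alpha` (assumed here) blends a Gaussian kernel with a sigmoidal one, so `alpha = 1` yields a purely local, bounded basis function and `alpha = 0` yields an unbounded sigmoidal response.

```python
import numpy as np

def ubf_activation(r, alpha):
    """Illustrative blend of a bounded and an unbounded transfer function.

    r     : distance from the basis function centre (array or scalar)
    alpha : mixing parameter in [0, 1] (hypothetical; not the paper's
            exact parameterisation).
            alpha = 1 -> pure Gaussian (local, effectively bounded)
            alpha = 0 -> pure sigmoid  (covers an unbounded subspace)
    """
    gaussian = np.exp(-r ** 2)            # bounded, radially local
    sigmoid = 1.0 / (1.0 + np.exp(r))     # unbounded, monotone tail
    return alpha * gaussian + (1.0 - alpha) * sigmoid

# In a UBF network, alpha would be optimised per basis function
# alongside the usual centres and widths, letting each unit adapt
# its shape to the local data distribution.
r = np.linspace(0.0, 3.0, 5)
print(ubf_activation(r, 0.5))
```

Because `alpha` enters the output smoothly, it can in principle be fitted by the same gradient-based optimisation used for the other RBF parameters, which is what allows the shape to "change smoothly from one functional form to another".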