
2004 | Original Paper | Book Chapter

Adaptive Transfer Functions in Radial Basis Function (RBF) Networks

Author: Günther A. Hoffmann

Published in: Computational Science - ICCS 2004

Publisher: Springer Berlin Heidelberg



The quality of Radial Basis Function (RBF) networks and other nonlinear learning networks such as Multi-Layer Perceptrons (MLP) depends significantly on issues of architecture, learning algorithms, initialisation heuristics and regularization techniques. Little attention has been given to the effect of mixture transfer functions in RBF networks on model quality and on the efficiency of parameter optimisation. We propose Universal Basis Functions (UBF), flexible activation functions that are parameterised to change their shape smoothly from one functional form to another. In this way they can cover bounded and unbounded subspaces depending on the data distribution. We define UBF and apply them to a number of classification and function approximation tasks. We find that the UBF approach outperforms traditional RBF on the Hermite data set, a noisy Fourier series and a non f-separable classification problem; however, it does not yield a statistically significant improvement on the Mackey-Glass chaotic time series. The paper concludes with comments and issues for future research.
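The abstract does not spell out the exact parameterisation of a UBF, so the following is only a minimal illustrative sketch of the general idea: a hidden-unit activation whose shape is blended between a bounded (Gaussian) form and an unbounded (sigmoidal) form by an additional shape parameter. The function name `universal_basis`, the convex-blend formulation and the parameter `alpha` are assumptions made for illustration, not the paper's definition.

```python
import numpy as np

def universal_basis(x, center, width, alpha):
    """Hypothetical 'universal' basis function (illustrative only):
    a convex blend of a bounded Gaussian bump and an unbounded
    sigmoidal ramp, controlled by a shape parameter alpha in [0, 1].
    alpha = 1 -> pure Gaussian (local, bounded response)
    alpha = 0 -> pure sigmoid  (global, unbounded response)
    """
    r = (x - center) / width               # scaled distance to the unit's center
    gaussian = np.exp(-r ** 2)             # bounded, radially local form
    sigmoid = 1.0 / (1.0 + np.exp(-r))     # unbounded, ridge-like form
    return alpha * gaussian + (1.0 - alpha) * sigmoid

# Tiny example: output of a three-unit hidden layer for a single input.
x = 0.7
centers = np.array([-1.0, 0.0, 1.0])
widths = np.array([0.5, 0.5, 0.5])
alphas = np.array([1.0, 0.5, 0.0])        # each unit sits anywhere between the two forms
weights = np.array([0.3, -0.2, 0.8])      # output-layer weights
y = np.sum(weights * universal_basis(x, centers, widths, alphas))
print(y)
```

In such a formulation the shape parameters could, in principle, be optimised alongside centers and widths, which is how the abstract's claim of smoothly adapting from one functional form to another would be realised in practice.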

Metadata
Title
Adaptive Transfer Functions in Radial Basis Function (RBF) Networks
Author
Günther A. Hoffmann
Copyright Year
2004
Publisher
Springer Berlin Heidelberg
DOI
https://doi.org/10.1007/978-3-540-24687-9_102
