2013 | Original Paper | Book Chapter
Overcoming the Local-Minimum Problem in Training Multilayer Perceptrons with the NRAE-MSE Training Method
Authors: James Ting-Ho Lo, Yichuan Gui, Yun Peng
Published in: Advances in Neural Networks – ISNN 2013
Publisher: Springer Berlin Heidelberg
The normalized risk-averting error (NRAE) training method presented at ISNN 2012 is capable of overcoming the local-minimum problem in training neural networks. However, its overall success rate is unsatisfactory. Motivated by this problem, a modification, called the NRAE-MSE training method, is herein proposed. The new method trains neural networks with respect to the NRAE with a fixed λ in the range of 10^6 to 10^11, and from time to time takes excursions to train with the standard mean squared error (MSE). Once an excursion produces a satisfactory MSE under cross-validation, the entire NRAE-MSE training stops. Numerical experiments show that the NRAE-MSE training method has a success rate of 100% in all the testing examples, each starting with a large number of randomly selected initial weights.