2012 | Original Paper | Book Chapter
Regularization Techniques to Improve Generalization
Author: Klaus-Robert Müller
Published in: Neural Networks: Tricks of the Trade
Publisher: Springer Berlin Heidelberg
Good tricks for regularization are extremely important for improving the generalization ability of neural networks. The first and most commonly used trick is early stopping, which was originally described in [11]. In its simplest version, the trick is as follows:
Take an independent validation set, e.g. by holding out a part of the training set, and monitor the error on this set during training. The error on the training set will decrease steadily, whereas the error on the validation set will first decrease and then increase. The early stopping point occurs where the error on the validation set is lowest; it is here that the network weights provide the best generalization.
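The procedure above can be sketched in a few lines. This is a minimal illustration, not the chapter's own implementation: it assumes the per-epoch validation errors are already available as a sequence, and adds a `patience` parameter (a common practical refinement) so that training is abandoned once the validation error has failed to improve for several consecutive epochs.

```python
def early_stopping_epoch(val_errors, patience=3):
    """Return the index of the early-stopping point: the epoch with the
    lowest validation error seen so far. Training is abandoned once the
    error has not improved for `patience` consecutive epochs; in a real
    training loop one would also keep a copy of the weights from the
    best epoch and restore them at the end."""
    best_epoch = 0
    best_err = float("inf")
    bad_epochs = 0
    for epoch, err in enumerate(val_errors):
        if err < best_err:
            best_err, best_epoch, bad_epochs = err, epoch, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break  # validation error keeps rising: stop training
    return best_epoch

# Typical U-shaped validation curve: the error falls, then rises as the
# network starts to overfit the training set (illustrative numbers).
errors = [0.9, 0.6, 0.4, 0.35, 0.37, 0.41, 0.50, 0.60]
print(early_stopping_epoch(errors))  # epoch 3 has the lowest error
```

In practice the monitored quantity is noisy, so the `patience` threshold prevents stopping at the first small fluctuation of the validation error.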