2010 | OriginalPaper | Chapter
Generalization Error of Faulty MLPs with Weight Decay Regularizer
Authors : Chi Sing Leung, John Sum, Shue Kwan Mak
Published in: Neural Information Processing. Models and Applications
Publisher: Springer Berlin Heidelberg
Weight decay is a simple regularization method for improving the generalization ability of multilayer perceptrons (MLPs). Moreover, weight decay can also improve the fault tolerance of MLPs. However, most existing generalization error results for the weight decay method consider fault-free MLPs only. For faulty MLPs, using a test set to study the generalization ability is impractical because a trained network has a huge number of possible faulty versions. This paper develops a prediction error formula for predicting the performance of faulty MLPs. Our prediction error results allow us to select an appropriate model for MLPs under the open node fault situation.
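To make the setting concrete, the following is a minimal sketch (not the paper's method) of training a single-hidden-layer MLP with a weight decay penalty and then measuring the mean squared error averaged over all single open node faults, i.e. networks in which one hidden node's output is forced to zero. The data, network size, learning rate, and decay coefficient `lam` are all hypothetical illustration choices; it only shows why exhaustively testing faulty networks scales with the number of fault configurations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumption: 4 inputs, 30 samples, 1-D target).
X = rng.normal(size=(30, 4))
w_true = rng.normal(size=4)
y = np.tanh(X @ w_true) + 0.1 * rng.normal(size=30)

H = 8        # number of hidden nodes (hypothetical)
lam = 1e-3   # weight decay coefficient (hypothetical)
lr = 0.05    # learning rate (hypothetical)

W1 = rng.normal(scale=0.5, size=(4, H))
w2 = rng.normal(scale=0.5, size=H)

def forward(X, W1, w2, mask=None):
    """MLP output; an open node fault zeroes the masked hidden outputs."""
    h = np.tanh(X @ W1)
    if mask is not None:
        h = h * mask
    return h @ w2

# Gradient descent on 0.5*MSE + 0.5*lam*(||W1||^2 + ||w2||^2):
# the lam terms below are the weight decay gradients.
for _ in range(500):
    h = np.tanh(X @ W1)
    err = h @ w2 - y
    g_w2 = h.T @ err / len(y) + lam * w2
    g_h = np.outer(err, w2) * (1.0 - h ** 2)   # backprop through tanh
    g_W1 = X.T @ g_h / len(y) + lam * W1
    w2 -= lr * g_w2
    W1 -= lr * g_W1

# Empirical fault-averaged error: one forward pass per faulty network.
# With multiple concurrent faults the number of configurations explodes,
# which is the motivation for an analytical prediction error formula.
mses = []
for j in range(H):
    mask = np.ones(H)
    mask[j] = 0.0               # open fault at hidden node j
    pred = forward(X, W1, w2, mask)
    mses.append(float(np.mean((pred - y) ** 2)))

fault_avg_mse = float(np.mean(mses))
print(fault_avg_mse)
```

Even in this toy case, covering all k-node fault combinations requires C(H, k) forward passes, so an analytical prediction of the fault-averaged error replaces an enumeration that is infeasible for realistic network sizes.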