Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks, and improving their training speed remains an active area of research. Instead of classical gradient-based training, we train MLPs with a MiniMin model that ensures the weights of the last layer are optimal at each step. Our method yields significant improvements in training speed on several large benchmark data sets.
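One plausible reading of this idea is a nested minimization: because a linear output layer admits a closed-form least-squares solution, the inner problem (output weights) can be solved exactly at every step while the hidden layer is updated by gradient descent. The sketch below is a hypothetical illustration of that scheme, not the paper's actual algorithm; the data, layer sizes, and learning rate are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical sketch of a MiniMin-style training step:
# inner minimization -> output weights solved exactly by least squares,
# outer minimization -> gradient descent on the hidden layer.
rng = np.random.default_rng(0)

# Toy regression data (assumption: any smooth target would do).
X = rng.uniform(-1, 1, size=(200, 2))
Y = np.sin(X[:, :1]) + X[:, 1:2] ** 2

n_hidden, lr = 16, 0.05
W1 = rng.normal(0.0, 0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)

for step in range(300):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    Ha = np.hstack([H, np.ones((len(X), 1))])   # append bias column
    # Inner problem: optimal last-layer weights in closed form.
    W2, *_ = np.linalg.lstsq(Ha, Y, rcond=None)
    E = Ha @ W2 - Y                             # residual with optimal W2
    # Outer problem: gradient step on the hidden layer only.
    dH = (E @ W2[:-1].T) * (1.0 - H ** 2)       # backprop through tanh
    W1 -= lr * X.T @ dH / len(X)
    b1 -= lr * dH.mean(axis=0)

mse = float((E ** 2).mean())
```

Because the output layer is exact at every iteration, the loss reflects only the hidden layer's quality, which is one way such a scheme could speed up training relative to updating all weights by gradient descent.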
- Training Multi-layer Perceptrons Using MiniMin Approach
- Springer Berlin Heidelberg