A MATLAB-Based Study on Approximation Performances of Improved Algorithms of Typical BP Neural Networks


Abstract:

BP neural networks are widely used, and many training algorithms exist for them. Based on artificial neural network theory, this paper studies the advantages and disadvantages of the improved algorithms of five typical BP networks. First, the learning processes of the improved algorithms of the five typical BP networks are elaborated mathematically. Then a specific network is designed on the MATLAB 7.0 platform to conduct an approximation test on a given nonlinear function. Finally, the training speeds and memory consumption of the five BP networks are compared. The simulation results indicate that for small- and medium-scale networks, the LM optimization algorithm has the best approximation ability, followed by the Quasi-Newton algorithm, the conjugate gradient method, the resilient BP algorithm, and the adaptive learning rate algorithm.

Keywords: BP neural network; improved algorithm; function approximation; MATLAB
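The comparison the abstract describes can be sketched in MATLAB's Neural Network Toolbox roughly as follows. This is an illustrative sketch only: the target function, network size, and training parameters are assumptions, not the paper's actual experimental setup, and the algorithm names (`traingda`, `trainrp`, `traincgf`, `trainbfg`, `trainlm`) are the toolbox's standard identifiers for the five methods compared.

```matlab
% Hypothetical sketch of the benchmark: approximate a nonlinear
% function with a 1-10-1 BP network under five training algorithms.
x = -1:0.05:1;
y = sin(2*pi*x) .* exp(-x.^2);           % example target (assumption)

% adaptive lr, resilient BP, conjugate gradient, Quasi-Newton, LM
algs = {'traingda', 'trainrp', 'traincgf', 'trainbfg', 'trainlm'};

for k = 1:numel(algs)
    % Old-style newff call (MATLAB 7.0 era): input range, layer sizes,
    % transfer functions, training algorithm.
    net = newff(minmax(x), [10 1], {'tansig', 'purelin'}, algs{k});
    net.trainParam.epochs = 1000;
    net.trainParam.goal   = 1e-4;
    net.trainParam.show   = NaN;         % suppress progress display

    tic;
    net = train(net, x, y);
    t = toc;

    e = mse(y - sim(net, x));            % approximation error
    fprintf('%-10s  time %6.2f s  mse %.2e\n', algs{k}, t, e);
end
```

Timing each `train` call and recording the final mean squared error gives the speed/accuracy comparison; memory consumption would additionally be tracked per algorithm (e.g. LM stores an approximate Hessian, which is why it trades memory for its fast convergence on small networks).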


Info:

Pages: 1353-1356

Online since: March 2013
