Published in: Neural Computing and Applications 1/2006

01.03.2006 | Original Article

Combined learning and pruning for recurrent radial basis function networks based on recursive least square algorithms

Authors: Chi Sing Leung, Ah Chung Tsoi


Abstract

This paper discusses a way to combine training and pruning for the construction of a recurrent radial basis function network (RRBFN) based on recursive least square (RLS) learning. In our approach, an RRBFN is first trained using the proposed RLS algorithms. Afterwards, the error covariance matrix, obtained directly from the RLS computations, is used to remove unimportant radial basis function (RBF) nodes. We propose two algorithms: (1) a “global” version suitable for low-dimensional input spaces, and (2) a “local” version applicable when the input dimension is large. In both cases, it is shown that the error covariance matrix obtained from the RLS algorithms can serve as a means of pruning the trained RRBFN. Simulation examples illustrate the proposed approaches.
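The two-stage procedure outlined in the abstract can be sketched in minimal form: recursive least squares estimates the output weights of an RBF network while maintaining the error covariance matrix P, whose diagonal can then rank nodes for pruning. This is an illustrative sketch only, not the paper's exact algorithm: it uses a non-recurrent RBF network for brevity, and the saliency measure w_i^2 / P_ii is an assumed optimal-brain-damage-style criterion; all function names are hypothetical.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian RBF activations for a single input vector x."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def rls_train(X, d, centers, width, lam=0.99, delta=100.0):
    """Train RBF output weights with recursive least squares.

    Returns the weight vector w and the error covariance matrix P,
    which the pruning step reuses. lam is the forgetting factor,
    delta the initial covariance scale.
    """
    m = len(centers)
    w = np.zeros(m)
    P = delta * np.eye(m)                      # error covariance matrix
    for x, target in zip(X, d):
        phi = rbf_features(x, centers, width)
        k = P @ phi / (lam + phi @ P @ phi)    # Kalman-style gain vector
        e = target - phi @ w                   # a priori output error
        w = w + k * e                          # weight update
        P = (P - np.outer(k, phi @ P)) / lam   # covariance update
    return w, P

def prune_by_saliency(w, P, keep):
    """Rank nodes by w_i^2 / P_ii (assumed saliency) and keep the top ones."""
    saliency = w ** 2 / np.diag(P)
    order = np.argsort(saliency)[::-1]
    return np.sort(order[:keep])
```

A typical use would train on a scalar time series or function sample, then call `prune_by_saliency` to select the subset of RBF centers retained by the reduced network, which can then be retrained or used directly.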


Metadata
Title
Combined learning and pruning for recurrent radial basis function networks based on recursive least square algorithms
Authors
Chi Sing Leung
Ah Chung Tsoi
Publication date
01.03.2006
Publisher
Springer Verlag
Published in
Neural Computing and Applications / Issue 1/2006
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-005-0009-7
