Published in: Neural Computing and Applications 7-8/2013

01.12.2013 | ISNN2012

Kernelized LARS–LASSO for constructing radial basis function neural networks

Authors: Quan Zhou, Shiji Song, Cheng Wu, Gao Huang


Abstract

Model structure selection is of crucial importance in radial basis function (RBF) neural networks. Existing model structure selection algorithms are essentially forward selection or backward elimination methods that may lead to sub-optimal models. This paper proposes an alternative selection procedure based on the kernelized least angle regression (LARS)–least absolute shrinkage and selection operator (LASSO) method. By formulating the RBF neural network as a linear-in-the-parameters model, we derive an ℓ1-constrained objective function for training the network. The proposed algorithm makes it possible to dynamically drop a previously selected regressor term that has become insignificant. Furthermore, inspired by the idea of LARS, the computation of the output weights in our algorithm is greatly simplified. Since the proposed algorithm conducts model structure selection and parameter optimization simultaneously, it builds a network with better generalization performance. Computational experiments on artificial and real-world data confirm the efficacy of the proposed algorithm.
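To make the linear-in-the-parameters formulation concrete, the following Python sketch (not the authors' implementation) builds a Gaussian RBF design matrix with every training sample as a candidate centre and fits it with scikit-learn's LARS-based LASSO solver, so that the ℓ1 penalty performs centre selection and output-weight estimation in one pass. The kernel width `sigma` and regularization strength `alpha` are illustrative choices, not values from the paper.

```python
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(0)

# Toy 1-D regression problem: noisy sinc function.
X = rng.uniform(-5, 5, size=(200, 1))
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(200)

def rbf_design_matrix(X, centers, sigma):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))."""
    sq_dists = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

# Linear-in-the-parameters RBF model: every training point is a candidate centre.
sigma = 1.0                      # assumed kernel width
Phi = rbf_design_matrix(X, X, sigma)

# LARS-LASSO: the l1 penalty drives most output weights to exactly zero, so
# structure selection (which centres survive) and output-weight estimation
# happen together along the regularization path.
model = LassoLars(alpha=1e-3, fit_intercept=True)
model.fit(Phi, y)

selected = np.flatnonzero(model.coef_)
print(f"selected {selected.size} of {Phi.shape[1]} candidate centres")

# Predict on new inputs using the sparse set of selected basis functions.
X_test = np.linspace(-5, 5, 100).reshape(-1, 1)
Phi_test = rbf_design_matrix(X_test, X, sigma)
y_pred = model.predict(Phi_test)
```

Centres whose coefficients are driven exactly to zero are effectively pruned from the network; because the LASSO path can also remove a coefficient that entered earlier, this loosely mirrors the paper's point that a previously selected regressor may later be dropped.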


Metadata
Title
Kernelized LARS–LASSO for constructing radial basis function neural networks
Authors
Quan Zhou
Shiji Song
Cheng Wu
Gao Huang
Publication date
01.12.2013
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 7-8/2013
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-012-1189-6
