Published in: Neural Computing and Applications 3-4/2013

01.03.2013 | Extreme Learning Machine's Theory & Application

A modified extreme learning machine with sigmoidal activation functions

Authors: Zhixiang X. Chen, Houying Y. Zhu, Yuguang G. Wang


Abstract

This paper proposes a modified extreme learning machine (ELM) algorithm that selects the input weights and biases in a principled way before training the output weights of single-hidden-layer feedforward neural networks with a sigmoidal activation function, and it proves mathematically that the resulting hidden-layer output matrix has full column rank. In contrast to the original ELM, the modified algorithm avoids randomness in the hidden-layer parameters. Experimental results on both regression and classification problems show the good performance of the modified ELM algorithm.


Metadata
Title
A modified extreme learning machine with sigmoidal activation functions
Authors
Zhixiang X. Chen
Houying Y. Zhu
Yuguang G. Wang
Publication date
01.03.2013
Publisher
Springer-Verlag
Published in
Neural Computing and Applications / Issue 3-4/2013
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-012-0860-2
