Two Novel Versions of Randomized Feed Forward Artificial Neural Networks: Stochastic and Pruned Stochastic

Author: Ömer Faruk Ertuğrul

Published in: Neural Processing Letters, Issue 1/2018

Published: 13-11-2017

Abstract

Although artificial neural networks (ANNs) have achieved high accuracies, determining the optimal number of neurons in the hidden layer and the activation function is still an open issue. In this paper, the applicability of assigning the number of hidden-layer neurons and the activation function randomly was investigated. Based on the findings, two novel versions of randomized ANNs, stochastic and pruned stochastic, were proposed to achieve higher accuracy without any time-consuming optimization stage. The proposed approaches were evaluated and validated against the basic versions of the popular randomized ANNs [1]: the random weight neural network [2], the random vector functional link network [3], and the extreme learning machine [4]. In the stochastic version of randomized ANNs, not only the weights and biases of the hidden-layer neurons but also the number of hidden-layer neurons and the activation function of each neuron were assigned randomly. In the pruned stochastic version of these methods, the winner networks were pruned according to a novel strategy in order to produce a faster response. The proposed approaches were validated on 60 datasets (30 classification and 30 regression datasets). The obtained accuracies and time usages showed that both versions of randomized ANNs can be employed for classification and regression.
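
For concreteness, the following is a minimal Python sketch of the kind of stochastic randomized network the abstract describes: the hidden-layer size, the input weights and biases, and each neuron's activation function are all drawn at random, and only the output weights are solved by least squares, in the spirit of ELM/RVFL-style training [2-4]. The candidate activation pool, weight ranges, and function names are illustrative assumptions rather than the paper's exact configuration, and the pruning strategy is not reproduced here.

import numpy as np

# Candidate activation pool (assumed for illustration; not the paper's exact set)
ACTIVATIONS = [np.tanh,
               lambda x: 1.0 / (1.0 + np.exp(-x)),   # logistic sigmoid
               np.sin,
               lambda x: np.maximum(0.0, x)]          # ReLU

def fit_stochastic_rnn(X, y, max_hidden=200, seed=None):
    """Train one stochastic randomized network: random hidden-layer size,
    random input weights/biases, random per-neuron activation, least-squares
    output weights. A sketch only, not the published implementation."""
    rng = np.random.default_rng(seed)
    n_hidden = int(rng.integers(1, max_hidden + 1))          # random hidden-layer size
    W = rng.uniform(-1.0, 1.0, (X.shape[1], n_hidden))       # random input weights
    b = rng.uniform(-1.0, 1.0, n_hidden)                     # random biases
    acts = [ACTIVATIONS[i] for i in rng.integers(0, len(ACTIVATIONS), n_hidden)]
    H = np.column_stack([acts[j](X @ W[:, j] + b[j]) for j in range(n_hidden)])
    beta = np.linalg.pinv(H) @ y                             # output weights by pseudo-inverse
    return W, b, acts, beta

def predict(model, X):
    W, b, acts, beta = model
    H = np.column_stack([acts[j](X @ W[:, j] + b[j]) for j in range(W.shape[1])])
    return H @ beta

# Usage example on synthetic regression data
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, (100, 3))
    y = np.sin(X).sum(axis=1)
    model = fit_stochastic_rnn(X, y, seed=1)
    print("train MSE:", np.mean((predict(model, X) - y) ** 2))

Under this scheme the only trained parameters are the output weights, which is what allows the random choices of architecture and activation to be evaluated quickly across many candidate networks.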


Literature
1. Zhang L, Suganthan PN (2016) A survey of randomized algorithms for training neural networks. Inf Sci 364:146–155
2. Schmidt WF, Kraaijveld MA, Duin RPW (1992) Feed forward neural networks with random weights. In: IEEE 11th international conference on pattern recognition, pp 1–4
3. Pao YH, Park GH, Sobajic DJ (1994) Learning and generalization characteristics of the random vector functional-link net. Neurocomputing 6(2):163–180
4. Huang G-B, Zhu Q, Siew C-K (2006) Extreme learning machine: theory and applications. Neurocomputing 70(1–3):489–501
6. Wang D (2016) Editorial: Randomized algorithms for training neural networks. Inf Sci 364–365:126–128
7. Li M, Wang D (2017) Insights into randomized algorithms for neural networks: practical issues and common pitfalls. Inf Sci 382–383:170–178
8. Zhang L, Suganthan PN (2016) A comprehensive evaluation of random vector functional link networks. Inf Sci 367–368:1094–1105
9. Huang G, Huang G-B, Song S, You K (2015) Trends in extreme learning machines: a review. Neural Netw 61:32–48
10. Huang G-B (2015) What are extreme learning machines? Filling the gap between Frank Rosenblatt's dream and John von Neumann's puzzle. Cogn Comput 7(3):263–278
11. Huang G-B (2014) An insight into extreme learning machines: random neurons, random features and kernels. Cogn Comput 6(3):376–390
12. Hernández-Aguirre A, Koutsougeras C, Buckles BP (2002) Sample complexity for function learning tasks through linear neural networks. Lect Notes Comput Sci 2313:262–271
13. Porwal A, Carranza EJM, Hale M (2003) Artificial neural networks for mineral-potential mapping: a case study from Aravalli Province, Western India. Nat Resour Res 12(3):155–171
14. Das DP, Panda G (2004) Active mitigation of nonlinear noise processes using a novel filtered-s LMS algorithm. IEEE Trans Speech Audio Process 12(3):313–322
15. Dudek G (2016) Extreme learning machine as a function approximator: initialization of input weights and biases. Adv Intell Syst Comput 403:59–69
16. Ertuğrul ÖF (2016) Forecasting electricity load by a novel recurrent extreme learning machines approach. Int J Electr Power Energy Syst 78:429
17. Ertuğrul ÖF, Kaya Y (2016) Smart city planning by estimating energy efficiency of buildings by extreme learning machine. In: 4th international Istanbul smart grid congress and fair, ICSG 2016
18. Ertuğrul ÖF, Emin Tağluk M, Kaya Y, Tekin R (2013) EMG signal classification by extreme learning machine. In: 21st signal processing and communications applications conference, SIU 2013
19. Li B, Li Y, Rong X (2013) The extreme learning machine learning algorithm with tunable activation function. Neural Comput Appl 22(3–4):531–539
20. Rychetsky M, Ortmann M, Glesner S (1998) Pruning and regularization techniques for feed forward nets applied on a real world data base. In: International symposium on neural computation, pp 603–609
21. Huang G-B, Chen L, Siew C-K (2006) Universal approximation using incremental constructive feedforward networks with random hidden nodes. IEEE Trans Neural Netw 17(4):879–892
22. Huang G-B, Chen L (2008) Enhanced random search based incremental extreme learning machine. Neurocomputing 71(16–18):3460–3468
23.
24. Duin RPW, Juszczak P, Paclik P, Pekalska E, de Ridder D (2004) PR-Tools 4.0, a Matlab toolbox for pattern recognition, The Netherlands
27. Huang G, Zhu Q (2006) Extreme learning machine: a new learning scheme of feedforward neural networks. Neurocomputing 70:489–501
29. Ertuğrul ÖF, Altun Ş (2016) Developing correlations by extreme learning machine for calculating higher heating values of waste frying oils from their physical properties. Neural Comput Appl 28:3145
30. Zhu Q-Y, Qin AK, Suganthan PN, Huang G-B (2005) Evolutionary extreme learning machine. Pattern Recogn 38(10):1759–1763
31. Huang G-B, Wang DH, Lan Y (2011) Extreme learning machines: a survey. Int J Mach Learn Cybern 2(2):107–122
32. Feng G, Huang G-B, Lin Q, Gay R (2009) Error minimized extreme learning machine with growth of hidden nodes and incremental learning. IEEE Trans Neural Netw 20(8):1352–1357
33. Cant E (2003) Pruning neural networks with distribution estimation algorithms. Neural Netw 25:790–800
34. Huang G-B, Chen L (2007) Convex incremental extreme learning machine. Neurocomputing 70(16–18):3056–3062
35. Rong H-J, Ong Y-S, Tan A-H, Zhu Z (2008) A fast pruned-extreme learning machine for classification problem. Neurocomputing 72(1–3):359–366
36. Alpaydın E (2010) Introduction to machine learning, 2nd edn. The MIT Press, Cambridge, MA, pp 39, 79
37. Ratsch G, Onoda T, Muller KR (1998) An improvement of AdaBoost to avoid overfitting. In: Advances in neural information processing systems, Kitakyushu, pp 506–509
38. Raymer ML, Doom TE, Kuhn LA, Punch WF (2003) Knowledge discovery in medical and biological datasets using a hybrid Bayes classifier/evolutionary algorithm. IEEE Trans Syst Man Cybern 33:802–814
39. Ertuğrul ÖF, Tağluk ME (2016) A novel machine learning method based on generalized behavioral learning theory. Neural Comput Appl 28:3921
40. Demšar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7:1–30
41. Tang J, Deng C, Huang G-B (2015) Extreme learning machine for multilayer perceptron. IEEE Trans Neural Netw Learn Syst 27:1–13
42. Liang N-Y, Huang G-B, Saratchandran P, Sundararajan N (2006) A fast and accurate online sequential learning algorithm for feedforward networks. IEEE Trans Neural Netw 17(6):1411–1423
43. Cao J, Lin Z, Huang G-B (2012) Self-adaptive evolutionary extreme learning machine. Neural Process Lett 36(3):285–305
44. Wang Y, Yuan Y, Yang X (2012) Bidirectional extreme learning machine for regression problem and its learning effectiveness. IEEE Trans Neural Netw Learn Syst 23(9):1498–1505
45. Alamo T, Tempo R, Luque A, Ramirez DR (2015) Randomized methods for design of uncertain systems: sample complexity and sequential algorithms. Automatica 52:160–172
Metadata
Title
Two Novel Versions of Randomized Feed Forward Artificial Neural Networks: Stochastic and Pruned Stochastic
Author
Ömer Faruk Ertuğrul
Publication date
13-11-2017
Publisher
Springer US
Published in
Neural Processing Letters / Issue 1/2018
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-017-9752-x
