Published in: International Journal of Machine Learning and Cybernetics 7/2022

11.03.2022 | Original Article

Self-stacking random weight neural network with multi-layer features fusion

Authors: Aimin Fu, Jingna Liu, Tian-Lun Zhang


Abstract

Randomized methods are practical and efficient for training connectionist models. In this paper, we develop a self-stacking random weight neural network and propose two different methods of feature fusion. The first inter-connects coarse and high-level features through a hierarchical network architecture with dense connectivity, making the classification decisions more diverse. The second incorporates the different decisions made throughout the network via a novel non-linear ensemble learning scheme trained in an end-to-end manner. Experiments verify the effectiveness of random feature fusion: even when each hierarchical branch of the network has very unfavorable accuracy, the proposed ensemble learning boosts the classification results impressively. Moreover, the proposed connectionist model is applied to a practical engineering problem, gearbox fault diagnosis, and simulations demonstrate that our method is more robust to noise in the vibration signals of a working gearbox.
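To make the two fusion ideas concrete, the following is a minimal sketch, not the authors' implementation: each self-stacking stage draws random (untrained) hidden weights, sees the raw input densely concatenated with all earlier random features, and solves its output weights in closed form by ridge regression, as is standard for random weight networks. The paper's learned non-linear ensemble is replaced here by a simple average of the per-stage decisions for illustration; all function names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_layer(X, n_hidden):
    """Random-weight hidden layer: weights are drawn once and never trained."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    return np.tanh(X @ W + b)

def ridge_solve(H, Y, lam=1e-2):
    """Closed-form output weights via ridge regression (the usual RWNN step)."""
    return np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ Y)

# Toy two-class data: two well-separated Gaussian blobs in 4 dimensions.
X = np.vstack([rng.normal(-1, 0.5, (50, 4)), rng.normal(1, 0.5, (50, 4))])
Y = np.vstack([np.tile([1, 0], (50, 1)), np.tile([0, 1], (50, 1))])

# Self-stacking with dense connectivity: every stage receives the raw input
# concatenated with all earlier random features (coarse and higher-level).
features = [X]
logits_per_stage = []
for _ in range(3):
    H = random_layer(np.hstack(features), n_hidden=30)
    features.append(H)
    beta = ridge_solve(np.hstack(features), Y)
    logits_per_stage.append(np.hstack(features) @ beta)

# Stand-in for the paper's non-linear ensemble: average per-stage decisions.
pred = np.argmax(np.mean(logits_per_stage, axis=0), axis=1)
accuracy = np.mean(pred == np.argmax(Y, axis=1))
```

Because no hidden weights are trained, each stage costs only one linear solve, which is what makes randomized training of such stacked models practical.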

Metadata
Title
Self-stacking random weight neural network with multi-layer features fusion
Authors
Aimin Fu
Jingna Liu
Tian-Lun Zhang
Publication date
11.03.2022
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 7/2022
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-021-01498-z
