Published in: Cognitive Computation 1/2019

26.09.2018

Region-Enhanced Multi-layer Extreme Learning Machine

Authors: Xibin Jia, Xiaobo Li, Ya Jin, Jun Miao


Abstract

Deep neural networks have made significant achievements in learning representations that were traditionally hand-crafted, especially for complex objects. Over the past decades, this line of research has attracted thousands of researchers and has been widely applied in speech, vision, and text recognition. One deep network, the multi-layer extreme learning machine (ML-ELM), performs well at representation learning while inheriting the fast training and approximation capability of the extreme learning machine (ELM). However, as with most deep networks, the ML-ELM's performance depends heavily on the probability distribution of the training data. In this paper, we propose an improved ML-ELM that enhances the contributions of locally significant regions at the input end, following the idea of the selective attention mechanism. To avoid exploring the complex principles of the attention system and to keep the focus on our local region-enhancement idea, we select only two typical attention regions. One is the geometric central region, which normally attracts human attention due to the focal attention mechanism. The other is a task-driven region of interest, with face recognition as the example. Comprehensive experiments are conducted on three public datasets: MNIST, NORB, and ORL. The comparison results demonstrate that the proposed region-enhanced ML-ELM (RE-ML-ELM) improves the learning of important features by exploiting a priori knowledge of attention, and achieves a higher recognition rate than both the standard ML-ELM and the basic ELM. Moreover, because it inherits the non-iterative parameter training of the ELM family, the proposed algorithm outperforms most state-of-the-art deep networks, such as the deep belief network (DBN), in training efficiency.
Furthermore, owing to its deep structure with fewer hidden nodes per layer, the proposed RE-ML-ELM matches the training efficiency of the ML-ELM and trains faster than the basic ELM, which, as a wide single-layer network, needs many more hidden nodes to reach a recognition accuracy comparable to that of deep networks. By combining a priori knowledge of the human selective attention system with data-driven learning, the proposed region-enhanced ML-ELM improves image classification performance. We believe that deliberately combining psychological knowledge with data-driven learning algorithms has the potential to improve their cognitive computing ability.
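The non-iterative ELM training mentioned in the abstract can be sketched in a few lines: hidden-layer weights are drawn at random and never trained, and only the output weights are solved in closed form via the Moore-Penrose pseudoinverse. The region-enhancement step is illustrated here as a simple scaling of input features inside an attended mask; the scalar boost factor and the boolean mask are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def emphasize_region(X, region_mask, boost=2.0):
    """Scale the features inside the attended region by `boost`.

    A simplified sketch of the region-enhancement idea: features in the
    attention region (e.g., the geometric center of the image) contribute
    more strongly at the input end.
    """
    return X * np.where(region_mask, boost, 1.0)

def train_elm(X, T, n_hidden, seed=0):
    """Basic single-layer ELM: random hidden layer, closed-form output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                     # non-iterative: pseudoinverse solve
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Stacking several such randomly initialized layers (trained layer-wise as autoencoders) yields the ML-ELM; the region-enhanced variant applies the emphasis step before the first layer.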


Metadata
Title
Region-Enhanced Multi-layer Extreme Learning Machine
Authors
Xibin Jia
Xiaobo Li
Ya Jin
Jun Miao
Publication date
26.09.2018
Publisher
Springer US
Published in
Cognitive Computation / Issue 1/2019
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-018-9596-3
