Published in: Cognitive Computation 5/2018

11-05-2018

Conditional Random Mapping for Effective ELM Feature Representation

Authors: Cheng Li, Chenwei Deng, Shichao Zhou, Baojun Zhao, Guang-Bin Huang

Abstract

Extreme learning machines (ELMs) have been extensively studied owing to their fast training and good generalization. Unfortunately, existing ELM-based feature representation methods are uncompetitive with state-of-the-art deep neural networks (DNNs) on complex visual recognition tasks. This weakness stems from two critical defects: (1) a random feature mapping (RFM) drawn from an ad hoc probability distribution cannot project diverse input data into discriminative feature spaces; (2) in ELM-based hierarchical architectures, features from the previous layer are scattered by the RFM of the current layer, which makes abstracting higher-level features ineffective. To address these issues, we exploit label information to optimize the random mapping in the ELM, using an efficient label alignment metric to learn a conditional random feature mapping (CRFM) in a supervised manner. We further propose a CRFM-based single-layer ELM (CELM) and extend it to a supervised multi-layer learning architecture (ML-CELM). Extensive experiments on various widely used datasets demonstrate that our approach is more effective than the original ELM-based and other existing DNN feature representation methods while retaining rapid training/testing speed. The proposed CELM and ML-CELM achieve discriminative and robust feature representations and show superior generalization and speed across various simulations.
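The CRFM itself is detailed in the full text; as background, the baseline single-layer ELM pipeline the abstract builds on, including the random feature mapping (RFM) step it criticizes, can be sketched as follows. This is a minimal illustration only: the hidden-layer size, `tanh` activation, and ridge term `reg` are illustrative choices, not values from the paper.

```python
import numpy as np

def elm_train(X, Y, n_hidden=100, reg=1e-3, rng=None):
    """Train a basic single-hidden-layer ELM.

    X: (n_samples, n_features) inputs; Y: (n_samples, n_classes) one-hot targets.
    Hidden weights are drawn at random (the RFM step) and never updated;
    only the output weights beta are solved in closed form.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # random feature mapping
    # Regularized least squares: beta = (H^T H + reg*I)^{-1} H^T Y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Map inputs through the fixed random layer, then apply beta."""
    return np.tanh(X @ W + b) @ beta

# Usage: fit XOR-style labels, which a linear model alone cannot separate.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[1., 0.], [0., 1.], [0., 1.], [1., 0.]])  # one-hot targets
W, b, beta = elm_train(X, Y, n_hidden=50, reg=1e-4, rng=0)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
```

The paper's contribution is to replace the unconditioned draw of `W` with a mapping conditioned on the training labels; in the sketch above `W` is purely random, which is exactly the defect (1) that the abstract identifies.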


Metadata
Title
Conditional Random Mapping for Effective ELM Feature Representation
Authors
Cheng Li
Chenwei Deng
Shichao Zhou
Baojun Zhao
Guang-Bin Huang
Publication date
11-05-2018
Publisher
Springer US
Published in
Cognitive Computation / Issue 5/2018
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-018-9557-x
