Published in: Cognitive Computation | Issue 6/2018

01.10.2018

Deep Weighted Extreme Learning Machine

Authors: Tianlei Wang, Jiuwen Cao, Xiaoping Lai, Badong Chen

Abstract

Imbalanced data classification has attracted increasing attention in recent years due to the continuous expansion of data available in many areas, such as biomedical engineering, surveillance, and computer vision. Learning from imbalanced data is challenging because most standard algorithms fail to properly represent the inherent complex characteristics of the data distribution. As an emerging technology, the extreme learning machine (ELM) and its variants, including the weighted ELM (WELM) and the boosting weighted ELM (BWELM), have recently been developed for the classification of imbalanced data. However, the WELM suffers from the following deficiencies: (i) the sample weight matrix is manually chosen and fixed during the learning phase; (ii) the representation capability, namely the capability to extract features or useful information from the original data, is insufficiently exploited due to its shallow structure. The BWELM employs the AdaBoost algorithm to optimize the sample weights, but its representation capability is still restricted by the shallow structure. To alleviate these deficiencies, we propose a novel deep weighted ELM (DWELM) algorithm for imbalanced data classification in this paper. An enhanced stacked multilayer deep representation network trained with the ELM (EH-DrELM) is first proposed to improve the representation capability, and a fast AdaBoost algorithm for imbalanced multiclass data (AdaBoost-ID) is developed to optimize the sample weights. The novel DWELM for imbalance learning is then obtained by combining these two algorithms. Experimental results on nine imbalanced binary-class datasets, nine imbalanced multiclass datasets, and five large benchmark datasets (three multiclass and two binary-class) show that the proposed DWELM achieves better performance than the WELM, the BWELM, and several state-of-the-art multilayer network-based learning algorithms.
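
For readers unfamiliar with the WELM that the paper builds on, the closed-form output-weight solution it relies on is compact enough to sketch. The snippet below is a minimal, illustrative weighted-ELM sketch, not the authors' implementation: it assumes a sigmoid hidden layer, a ridge parameter C, and the common per-class weighting in which each sample's weight is the inverse of its class size; the function name weighted_elm and its parameters are hypothetical.

```python
import numpy as np

def weighted_elm(X, y, n_hidden=100, C=1.0, rng=None):
    """Minimal weighted-ELM sketch (weights inversely proportional to class size).

    X: (N, d) feature matrix, y: (N,) integer class labels.
    Returns a prediction function for new samples.
    """
    rng = np.random.default_rng(rng)
    N, d = X.shape
    classes, counts = np.unique(y, return_counts=True)
    idx = np.searchsorted(classes, y)

    # Random hidden layer (fixed, never trained), sigmoid activation.
    W_in = rng.uniform(-1.0, 1.0, size=(d, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W_in + b)))            # (N, L)

    # One-hot targets and per-sample weights; minority classes get larger weights.
    T = np.eye(len(classes))[idx]                         # (N, m)
    w = 1.0 / counts[idx]                                 # diagonal of the weight matrix

    # Ridge-regularized weighted least squares:
    # beta = (I/C + H^T diag(w) H)^{-1} H^T diag(w) T
    HtW = H.T * w                                          # equals H^T @ diag(w)
    beta = np.linalg.solve(np.eye(n_hidden) / C + HtW @ H, HtW @ T)

    def predict(X_new):
        H_new = 1.0 / (1.0 + np.exp(-(X_new @ W_in + b)))
        return classes[np.argmax(H_new @ beta, axis=1)]

    return predict
```

In the DWELM proposed in the paper, the raw features X would instead be replaced by the representation learned by the stacked EH-DrELM network, and the fixed per-sample weights w would be re-optimized iteratively by AdaBoost-ID rather than set manually.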

Metadata
Title
Deep Weighted Extreme Learning Machine
Authors
Tianlei Wang
Jiuwen Cao
Xiaoping Lai
Badong Chen
Publication date
01.10.2018
Publisher
Springer US
Published in
Cognitive Computation / Issue 6/2018
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-018-9602-9
