Published: 27.08.2022

C-Loss-Based Doubly Regularized Extreme Learning Machine

Authors: Qing Wu, Yan-Lin Fu, Dong-Shun Cui, En Wang

Published in: Cognitive Computation | Issue 2/2023

Abstract

The extreme learning machine has become a significant learning methodology because of its efficiency. However, it is highly sensitive to outliers and therefore prone to overfitting. In this paper, a novel extreme learning machine, the C-loss-based doubly regularized extreme learning machine, is presented to address dimensionality reduction and overfitting. The proposed algorithm benefits from both the L1 norm and the L2 norm and replaces the square loss function with a C-loss function. It carries out feature selection and training simultaneously, and it also suppresses noise and irrelevant information in the data to reduce dimensionality. To demonstrate its efficiency in dimension reduction, we test it on the Swiss Roll dataset and obtain high efficiency and stable performance. Experimental results on different types of artificial and benchmark datasets show that the proposed method achieves much better regression results and faster training than the compared methods. Performance analysis also shows that it significantly decreases training time, alleviates the problem of overfitting, and improves generalization ability.
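
To make the approach concrete, here is a minimal illustrative sketch (Python/NumPy) of the idea summarized above: a single-hidden-layer ELM with random, fixed input weights whose output weights are fit by minimizing a bounded C-loss (correntropy-induced loss) plus both an L1 and an L2 penalty on the output weights. The parameter names (sigma, lam1, lam2), the tanh activation, and the proximal-gradient solver are assumptions made for illustration, not necessarily the formulation details or the optimization method used in the paper.

import numpy as np

def c_loss(e, sigma=1.0):
    # Correntropy-induced (C-) loss: bounded, smooth alternative to the square loss.
    return 1.0 - np.exp(-e**2 / (2.0 * sigma**2))

def c_loss_grad(e, sigma=1.0):
    # Derivative of the C-loss with respect to the residual e.
    return (e / sigma**2) * np.exp(-e**2 / (2.0 * sigma**2))

def soft_threshold(x, t):
    # Proximal operator of the L1 norm (soft thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def train_elm_closs_elastic(X, y, n_hidden=100, sigma=1.0,
                            lam1=1e-2, lam2=1e-3, lr=1e-2, n_iter=1000, seed=0):
    # Fit output weights beta by proximal gradient descent on
    #   (1/n) * sum_i C-loss(h(x_i) beta - y_i) + lam1 * ||beta||_1 + lam2 * ||beta||_2^2
    # (an illustrative solver choice, not the paper's algorithm).
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.standard_normal((d, n_hidden))   # random input weights, fixed as in a standard ELM
    b = rng.standard_normal(n_hidden)        # random hidden biases
    H = np.tanh(X @ W + b)                   # hidden-layer output matrix
    beta = np.zeros(n_hidden)
    for _ in range(n_iter):
        e = H @ beta - y
        grad = H.T @ c_loss_grad(e, sigma) / n + 2.0 * lam2 * beta
        beta = soft_threshold(beta - lr * grad, lr * lam1)   # L1 handled by the proximal step
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: noisy sinc regression with a few injected outliers.
rng = np.random.default_rng(1)
X = rng.uniform(-5.0, 5.0, size=(300, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(300)
y[:10] += 3.0                                # outliers that a square loss would chase
W, b, beta = train_elm_closs_elastic(X, y)
print("nonzero output weights:", np.count_nonzero(beta))
print("mean C-loss on training data:", c_loss(predict(X, W, b, beta) - y).mean())

In this sketch the bounded C-loss caps the influence of the injected outliers, while the L1 term zeroes out some hidden nodes (feature selection) and the L2 term keeps the remaining weights small.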

Metadata
Title
C-Loss-Based Doubly Regularized Extreme Learning Machine
Authors
Qing Wu
Yan-Lin Fu
Dong-Shun Cui
En Wang
Publication date
27.08.2022
Publisher
Springer US
Published in
Cognitive Computation / Issue 2/2023
Print ISSN: 1866-9956
Electronic ISSN: 1866-9964
DOI
https://doi.org/10.1007/s12559-022-10050-2