Published in: International Journal of Machine Learning and Cybernetics 5/2024

24-11-2023 | Original Article

Hessian unsupervised extreme learning machine

Authors: Sharana Dharshikgan Suresh Dass, Ganesh Krishnasamy, Raveendran Paramesran, Raphaël C.-W. Phan


Abstract

Extreme learning machines (ELMs) have been shown to be efficient and effective learning algorithms for regression and classification tasks. ELMs, however, are typically used to solve supervised learning problems, and only a handful of studies on ELMs explore unlabeled data. One representative work is the unsupervised extreme learning machine (US-ELM), which extends the standard ELM to unsupervised learning via Laplacian regularization. However, Laplacian regularization has poor extrapolation power, since it tends to bias the solution towards a constant function. In this paper, we propose a new framework, termed Hessian unsupervised ELM (HUS-ELM), to enhance the unsupervised learning of ELM. In particular, Hessian regularization exploits the intrinsic local geometry of the data manifold more faithfully than Laplacian regularization. This improves the performance of HUS-ELM on unsupervised learning problems, because the Hessian regularizer correctly reflects the positional relationship between the unlabeled samples. Six publicly available datasets are used to evaluate the proposed algorithm. The experimental results indicate that the proposed method outperforms other unsupervised learning methods in terms of clustering accuracy.
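The full paper is behind a paywall, so the following is only a minimal sketch of the US-ELM-style framework the abstract builds on: a random-feature ELM hidden layer, a manifold regularizer matrix M, and an embedding obtained from a generalized eigenvalue problem, followed by clustering of the embedded points. All function names are illustrative; M is a placeholder where HUS-ELM would plug in a Hessian energy matrix in place of the graph Laplacian used here for demonstration.

```python
import numpy as np
from scipy.linalg import eigh

def elm_hidden_layer(X, n_hidden=64, seed=0):
    """Random-feature ELM hidden layer with sigmoid activation."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized kNN graph.
    HUS-ELM would instead use a Hessian energy matrix built from
    local tangent-space fits (estimation omitted in this sketch)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    n = X.shape[0]
    A = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]  # skip self at index 0
        A[i, nbrs] = 1.0
    A = np.maximum(A, A.T)                # symmetrize the graph
    return np.diag(A.sum(axis=1)) - A

def unsupervised_elm_embedding(X, M, n_components=2, lam=0.1, n_hidden=64):
    """US-ELM-style embedding: solve the generalized eigenproblem
        (I + lam * H^T M H) v = gamma * (H^T H) v
    and keep the eigenvectors with the smallest eigenvalues,
    skipping the first (near-constant) solution."""
    H = elm_hidden_layer(X, n_hidden)
    A = np.eye(n_hidden) + lam * H.T @ M @ H
    B = H.T @ H + 1e-8 * np.eye(n_hidden)  # ridge keeps B positive definite
    _, vecs = eigh(A, B)                   # eigenvalues in ascending order
    beta = vecs[:, 1:n_components + 1]
    return H @ beta                        # embedded points, one row per sample

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
               rng.normal(3.0, 0.3, (20, 2))])
E = unsupervised_elm_embedding(X, knn_laplacian(X, k=4))
```

In the full method, the embedding E would then be clustered (e.g. with k-means) and evaluated by clustering accuracy, as in the paper's experiments.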

Metadata
Title
Hessian unsupervised extreme learning machine
Authors
Sharana Dharshikgan Suresh Dass
Ganesh Krishnasamy
Raveendran Paramesran
Raphaël C.-W. Phan
Publication date
24-11-2023
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 5/2024
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-023-02012-3
