Published in: Neural Computing and Applications 11/2020

14.03.2019 | Multi-Source Data Understanding (MSDU)

Unsupervised feature selection based on joint spectral learning and general sparse regression

By: Tao Chen, Yanrong Guo, Shijie Hao


Abstract

Unsupervised feature selection is an important machine learning task, since manually annotated data are expensive to obtain and therefore very limited. However, because of noise and outliers in the data samples, feature selection without the discriminant information embedded in annotations is quite challenging. To address these limitations, we investigate embedding spectral learning into a general sparse regression framework for unsupervised feature selection. The proposed general spectral sparse regression (GSSR) method jointly handles outlier features by learning joint sparsity and noisy features by preserving the local structure of the data. Specifically, GSSR proceeds in two stages. First, classic sparse dictionary learning is used to build the bases of the original data. Then, the original data are projected into the basis space by learning a new representation via GSSR. In GSSR, the robust \(\ell _{2,r}\)-norm loss \((0<r\le 2)\) and the \(\ell _{2,p}\)-norm \((0<p\le 1)\), instead of the traditional Frobenius norm and least-squares loss, are simultaneously adopted as the reconstruction term and the sparse regularization term of the sparse regression. Furthermore, the local topological structure of the new representations is preserved through spectral learning based on a Laplacian term. The overall objective function of GSSR is optimized and proved to converge. Experimental results on several publicly available datasets demonstrate the validity of our algorithm, which outperforms state-of-the-art feature selection methods in terms of classification performance.


Metadata
Title
Unsupervised feature selection based on joint spectral learning and general sparse regression
Authors
Tao Chen
Yanrong Guo
Shijie Hao
Publication date
14.03.2019
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 11/2020
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-019-04117-9
