Published in: Neural Computing and Applications 7/2012

01.10.2012 | Original Article

Unsupervised maximum margin feature selection via L2,1-norm minimization

Authors: Shizhun Yang, Chenping Hou, Feiping Nie, Yi Wu


Abstract

In this article, we present an unsupervised maximum margin feature selection algorithm with sparse constraints. The algorithm combines feature selection and K-means clustering into a coherent framework. L2,1-norm regularization is applied to the transformation matrix to enable feature selection across all data samples. Our method is equivalent to a convex optimization problem, which we solve with an iterative algorithm that converges to an optimal solution. A convergence analysis of the algorithm is also provided. Experimental results demonstrate the efficiency of our algorithm.
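The abstract describes applying L2,1-norm regularization to a transformation matrix so that entire rows are driven toward zero, which selects features jointly across all samples. As a rough illustration only (not the authors' exact algorithm, whose details are not reproduced here), the hypothetical sketch below shows the L2,1 norm itself and a common iteratively reweighted solver for a regression-style subproblem, min_W ||XW − Y||_F² + γ·||W||_{2,1}, of the kind that arises when cluster indicators Y are held fixed; all function names and parameters are illustrative assumptions:

```python
import numpy as np

def l21_norm(W):
    """L2,1 norm of W: the sum of the Euclidean norms of its rows."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

def l21_regularized_regression(X, Y, gamma=0.5, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver (illustrative) for
        min_W ||X W - Y||_F^2 + gamma * ||W||_{2,1}.
    Each iteration solves a weighted ridge-like system where row i of W
    is penalized by 1 / (2 * ||w_i||), the standard reweighting for the
    L2,1 penalty; eps guards against division by zero.
    """
    # Initialize with the unregularized least-squares solution.
    W = np.linalg.lstsq(X, Y, rcond=None)[0]
    XtX, XtY = X.T @ X, X.T @ Y
    for _ in range(n_iter):
        row_norms = np.sqrt(np.sum(W ** 2, axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
        # Stationarity condition: (X^T X + gamma * D) W = X^T Y
        W = np.linalg.solve(XtX + gamma * D, XtY)
    return W
```

Because the penalty acts on whole rows, features can then be ranked by the row norms of the returned W; rows shrunk to (near) zero correspond to features the model discards.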


Metadata
Title
Unsupervised maximum margin feature selection via L2,1-norm minimization
Authors
Shizhun Yang
Chenping Hou
Feiping Nie
Yi Wu
Publication date
01.10.2012
Publisher
Springer-Verlag
Published in
Neural Computing and Applications / Issue 7/2012
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-012-0827-3
