Published in: Neural Computing and Applications 2/2013

01-08-2013 | Original Article

An adaptive class pairwise dimensionality reduction algorithm

Authors: Lifang He, Xiaowei Yang, Zhifeng Hao

Abstract

Support vector machines (SVMs) have achieved great success in multi-class classification. However, as dimensionality increases, irrelevant or redundant features may degrade the generalization performance of SVM classifiers, which makes dimensionality reduction (DR) indispensable for high-dimensional data. At present, most DR algorithms either reduce all data points of a multi-class dataset to the same dimension or search for the local latent dimension of each class; they neglect the fact that different class pairs also have different local latent dimensions. In this paper, we propose an adaptive class pairwise dimensionality reduction algorithm (ACPDR) to improve the generalization performance of multi-class SVM classifiers. In the proposed algorithm, on the one hand, different class pairs are reduced to different dimensions; on the other hand, a tabu strategy is adopted to select a suitable embedding dimension adaptively. Five popular DR algorithms are employed in our experiments, and the numerical results on several benchmark multi-class datasets show that, compared with the traditional DR algorithms, the proposed ACPDR improves the generalization performance of multi-class SVM classifiers. They also verify that it is reasonable to assume that different class pairs have different local dimensions.
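The class-pairwise idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: PCA stands in for the DR step, a plain grid search over candidate dimensions replaces the tabu strategy, and the dataset and parameter choices are only for demonstration. Each class pair gets its own projection dimension and its own binary SVM, and a one-vs-one majority vote combines the pairwise classifiers.

```python
import numpy as np
from itertools import combinations
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def fit_class_pairwise(X, y, candidate_dims):
    """Train one binary SVM per class pair, reducing each pair's data to
    its own embedding dimension (PCA is a placeholder DR method here)."""
    models = {}
    for a, b in combinations(sorted(set(y)), 2):
        mask = (y == a) | (y == b)
        Xp, yp = X[mask], y[mask]
        best = None
        # Plain grid search stands in for the paper's tabu strategy:
        # keep the dimension that fits this particular pair best.
        for d in candidate_dims:
            pca = PCA(n_components=d).fit(Xp)
            clf = SVC(kernel="linear").fit(pca.transform(Xp), yp)
            acc = clf.score(pca.transform(Xp), yp)
            if best is None or acc > best[0]:
                best = (acc, d, pca, clf)
        models[(a, b)] = best[1:]  # (chosen dim, projector, classifier)
    return models

def predict_by_voting(models, X, classes):
    """One-vs-one majority vote; each pair projects X with its own DR map."""
    votes = np.zeros((len(X), len(classes)), dtype=int)
    index = {c: i for i, c in enumerate(classes)}
    for (a, b), (_, pca, clf) in models.items():
        for i, p in enumerate(clf.predict(pca.transform(X))):
            votes[i, index[p]] += 1
    return np.array(classes)[votes.argmax(axis=1)]

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
models = fit_class_pairwise(X_tr, y_tr, candidate_dims=[1, 2, 3])
y_hat = predict_by_voting(models, X_te, classes=sorted(set(y)))
test_acc = (y_hat == y_te).mean()
```

Note that the chosen dimension may differ from pair to pair, which is exactly the point of the class-pairwise scheme; the actual ACPDR uses a tabu search rather than this exhaustive scan to make the selection adaptive.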


Metadata
Title
An adaptive class pairwise dimensionality reduction algorithm
Authors
Lifang He
Xiaowei Yang
Zhifeng Hao
Publication date
01-08-2013
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 2/2013
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-012-0897-2
