Published in: Pattern Analysis and Applications 3/2005

01.12.2005 | Theoretical Advances

An ensemble-based method for linear feature extraction for two-class problems


Abstract

In this paper we propose three variants of a linear feature extraction technique based on Adaboost for two-class classification problems. Unlike many other feature extraction techniques, we make no assumptions about the distribution of the data. At each boosting step we select, from a pool of linear projections, the one that minimizes the weighted error. The three proposed variants differ in the way the pool of candidate projections is constructed. Using nine real and two artificial data sets of different original dimensionality and sample size, we compare the performance of the three proposed techniques with three classical techniques for linear feature extraction: Fisher linear discriminant analysis (FLD), nonparametric discriminant analysis (NDA), and a recently proposed feature extraction method for heteroscedastic data based on the Chernoff criterion. Our results show that for data sets of relatively low original dimensionality, FLD appears to be both the most accurate and the most economical feature extraction method (yielding just one dimension in the two-class case). The techniques based on Adaboost fare better than the classical techniques for data sets of large original dimensionality.
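The boosting step described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes the pool is a fixed list of unit direction vectors and that each candidate projection is scored by fitting a simple threshold stump on the projected data (the paper's pool-construction variants and weak-learner form may differ); the function name `boosted_projection_selection` is hypothetical.

```python
import numpy as np

def boosted_projection_selection(X, y, pool, T=10):
    """AdaBoost-style selection of linear projections (illustrative sketch).

    X    : (n, d) data matrix
    y    : (n,) labels in {-1, +1}
    pool : list of (d,) candidate projection directions
    T    : number of boosting rounds

    At each round, pick from `pool` the 1-D projection whose best
    threshold stump minimizes the weighted classification error,
    then re-weight the samples as in standard AdaBoost.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    chosen = []
    for _ in range(T):
        best = None
        for v in pool:
            z = X @ v                # project samples onto direction v
            # weak learner: threshold stump on the projected values
            for thr in np.unique(z):
                for sign in (1, -1):
                    pred = sign * np.where(z > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, v, thr, sign)
        err, v, thr, sign = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # weak-learner weight
        pred = sign * np.where(X @ v > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # boost misclassified samples
        w /= w.sum()
        chosen.append((alpha, v, thr, sign))
    return chosen                    # the selected projections + stumps
```

The selected directions `v` form the extracted feature set; the weights `alpha` additionally give the AdaBoost ensemble decision `sign(sum_t alpha_t h_t(x))` if one wants a classifier rather than only a projection basis.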


Footnotes
1
We assume that the reader is familiar with Adaboost, although the feature extraction procedure should be reproducible from Fig. 3.

2
Functions from the PRTOOLS 3.1.7 toolbox [24] have been used for classifiers 1–3. For the SVM classifier we used the OSU SVM Classifier Matlab toolbox 3.00, which can be downloaded from http://www.ece.osu.edu/~maj/osu_svm/.

3
Full information about the standard deviations and the calculated confidence intervals can be found at http://www.cvc.uab.es/~davidm/experiments.htm
 
References
1. Hyvarinen A, Karhunen J, Oja E (2001) Independent component analysis. John Wiley and Sons
2. Friedman JH (1987) Explanatory projection pursuit. J Am Statistical Assoc 82:249–266
3. Lee DD, Seung HS (1999) Learning the parts of objects with nonnegative matrix factorization. Nature 401:788–791
4. Fisher R (1936) The use of multiple measurements in taxonomic problems. Ann Eugenics 7:179–188
5. Fukunaga K, Mantock J (1983) Nonparametric discriminant analysis. IEEE T Pattern 5(6):671–678
6. Loog M, Duin RPW (2004) Linear dimensionality reduction via a heteroscedastic extension of LDA: the Chernoff criterion. IEEE T Pattern Anal 26(6):732–739
7. McLachlan GJ (2004) Discriminant analysis and statistical pattern recognition. John Wiley and Sons, Inc, New York
8. Roweis ST, Saul LK (2000) Nonlinear dimensionality reduction by locally linear embedding. Science 290:2323–2326
9. Tenenbaum JB, de Silva V, Langford JC (2000) A global geometric framework for nonlinear dimensionality reduction. Science 290(5500):2319–2323
10.
11. Athitsos V, Alon J, Sclaroff S, Kollios G (2004) Boostmap: a method for efficient approximate similarity rankings. In: CVPR (2), 2004, pp 268–275
12. Sirlantzis K, Hoque S, Fairhurst MC (2002) Trainable multiple classifier schemes for handwritten character recognition. In: Multiple classifier systems, 2002, pp 169–178
13. Brown G, Yao X, Wyatt J, Wersing H, Sendhoff B (2002) Exploiting ensemble diversity for automatic feature extraction. In: Proc. of the 9th international conference on neural information processing (ICONIP'02), 2002, pp 1786–1790
14. Kuncheva LI (2004) Combining pattern classifiers. John Wiley and Sons
16. Kirby M, Sirovich L (1990) Application of the Karhunen-Loeve procedure for the characterization of human faces. IEEE T Pattern Anal 12(1):103–108
17. Fukunaga K (1990) Introduction to statistical pattern recognition, 2nd edn. Academic Press, Boston
18. Bressan M, Vitria J (2003) Nonparametric discriminant analysis and nearest neighbor classification. Pattern Recogn Lett 24(15):2743–2749
19. Schapire RE (1999) A brief introduction to boosting. In: IJCAI, 1999, pp 1401–1406
20. Freund Y, Schapire RE (1996) Experiments with a new boosting algorithm. In: International conference on machine learning, 1996, pp 148–156
22. Skurichina M (2001) Stabilizing weak classifiers. Ph.D. thesis, Delft University of Technology
23. Martinez A, Benavente R (1998) The AR face database. Tech Rep 24, Computer Vision Center (June 1998)
Metadata
Title
An ensemble-based method for linear feature extraction for two-class problems
Publication date
01.12.2005
Published in
Pattern Analysis and Applications / Issue 3/2005
Print ISSN: 1433-7541
Electronic ISSN: 1433-755X
DOI
https://doi.org/10.1007/s10044-005-0002-x
