Published in: Neural Computing and Applications 2/2009

01.02.2009 | Original Article

Particle swarm optimization for ensembling generation for evidential k-nearest-neighbour classifier

Authors: Loris Nanni, Alessandra Lumini

Abstract

This paper addresses ensemble generation for the evidential k-nearest-neighbour (EkNN) classifier and proposes an efficient method based on particle swarm optimization (PSO). We improve the performance of the EkNN classifier with a random-subspace ensembling method: given a set of random-subspace EkNN classifiers, PSO is used to find the best parameters of each classifier, and the resulting classifiers are then combined by the "vote rule". The improvement over state-of-the-art approaches is validated through experiments on several benchmark datasets.
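
To make the pipeline described above concrete, the following is a minimal sketch, not the authors' implementation, of a random-subspace EkNN ensemble whose per-class gamma parameters are tuned by a basic global-best PSO and whose members are fused by majority vote. The helper names (eknn_predict, pso, random_subspace_eknn_ensemble), the PSO constants, and the use of resubstitution accuracy as the fitness function are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch, assuming integer class labels 0..C-1.  NOT the authors' code:
# helper names, PSO constants and the fitness criterion are illustrative assumptions.
import numpy as np

def eknn_predict(X_train, y_train, X_test, k, alpha, gamma):
    """Evidential k-NN (Denoeux-style): each of the k neighbours contributes a simple
    mass function focused on its own class; masses are pooled with Dempster's rule."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)
        nn = np.argsort(d2)[:k]
        # P[q] = product over neighbours of class q of (1 - s_i), s_i = alpha*exp(-gamma_q*d_i^2)
        P = np.ones(len(classes))
        for i in nn:
            q = np.searchsorted(classes, y_train[i])
            P[q] *= 1.0 - alpha * np.exp(-gamma[q] * d2[i])
        # unnormalised combined singleton masses; normalisation does not change the argmax
        m = (1.0 - P) * np.array([np.prod(np.delete(P, q)) for q in range(len(classes))])
        preds.append(classes[np.argmax(m)])
    return np.array(preds)

def pso(fitness, dim, n_particles=10, iters=30, lo=0.01, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, rng=None):
    """Plain global-best PSO maximising `fitness` over a box-constrained real vector."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmax(pbest_f)].copy()
    return gbest

def random_subspace_eknn_ensemble(X_tr, y_tr, X_te, n_members=10, subspace=0.5,
                                  k=5, alpha=0.95, rng=None):
    """Random-subspace EkNN members, each with PSO-tuned per-class gamma, fused by vote."""
    rng = np.random.default_rng(1) if rng is None else rng
    n_classes = len(np.unique(y_tr))
    votes = []
    for _ in range(n_members):
        feats = rng.choice(X_tr.shape[1], max(1, int(subspace * X_tr.shape[1])), replace=False)
        Xs, Xt = X_tr[:, feats], X_te[:, feats]
        # fitness = accuracy on the training data itself (a simplification of the paper's criterion)
        fit = lambda gam: np.mean(eknn_predict(Xs, y_tr, Xs, k, alpha, gam) == y_tr)
        gamma = pso(fit, n_classes, rng=rng)
        votes.append(eknn_predict(Xs, y_tr, Xt, k, alpha, gamma))
    votes = np.stack(votes)  # shape (n_members, n_test)
    return np.array([np.bincount(col).argmax() for col in votes.T])  # majority "vote rule"
```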

Footnotes
1
A parameter of the EkNN classifier.
 
2
These algorithms are already implemented in MATLAB by their authors and can be downloaded from http://www.hds.utc.fr/tdenoeux/software.htm.
 
3
It is implemented as in the PSO MATLAB Toolbox, which is available at http://psotoolbox.sourceforge.net.
 
Metadata
Title
Particle swarm optimization for ensembling generation for evidential k-nearest-neighbour classifier
Authors
Loris Nanni
Alessandra Lumini
Publication date
01.02.2009
Publisher
Springer-Verlag
Published in
Neural Computing and Applications / Issue 2/2009
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-007-0162-2
