Published in: International Journal of Machine Learning and Cybernetics 1/2011

01.03.2011 | Original Article

An improved multiple fuzzy NNC system based on mutual information and fuzzy integral

Author: Li Juan Wang


Abstract

The multiple nearest neighbor classifier system (MNNCS) is a popular approach to alleviating the curse of dimensionality. In previous work, most MNNCSs have been constructed by random methods, which may generate unstable component classifiers; to offset this randomness, a large number of component classifiers is required. This paper first extends the nearest neighbor classifier to a fuzzy nearest neighbor classifier, and then proposes a new multiple fuzzy nearest neighbor classifier system based on mutual information and the fuzzy integral, called MIFI-MFNNCS. MIFI-MFNNCS adopts target perturbation, which decomposes the original classification problem into several sub-problems, each representing one class. Each sub-problem is described by its relevant data and features and is handled by one component classifier, so the number of component classifiers can be fixed and reduced. For each component classifier, data are selected according to class, and features are selected by mutual information, which reduces the uncertainty of the component classifier. Feature selection by mutual information in MIFI-MFNNCS is therefore less affected by interactions among different classes. The diverse decisions of the sub-problem classifiers are combined by the fuzzy integral to obtain the final decision, and we propose a simple new method for computing the density values from mutual information. To demonstrate the performance of the proposed MIFI-MFNNCS, we perform experimental comparisons on five UCI datasets. The results of the component classifiers of MIFI-MFNNCS on the Ionosphere dataset are presented and analyzed. MIFI-MFNNCS is compared with (1) the nearest neighbor classifier (NNC) and (2) NNC after feature selection by mutual information (MI-FS-NNC). Within the multiple fuzzy nearest neighbor classifier system (MFNNCS), mutual information is compared with attribute bagging, and three combination methods are compared: the fuzzy integral, the majority voting rule, and the average. The experimental results show that MIFI-MFNNCS achieves higher accuracy than the other methods, that mutual information is superior to attribute bagging, and that the fuzzy integral outperforms the majority voting rule and the average.
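The abstract combines two ingredients: mutual-information scores for ranking features (and deriving fuzzy densities), and a fuzzy integral for fusing the confidences of the component classifiers. The sketch below is an illustrative reconstruction under common assumptions, not the paper's implementation: it computes mutual information for discrete features, fits the standard Sugeno λ-fuzzy measure from per-classifier densities, and fuses scores with the Sugeno integral. The density values would, as the abstract suggests, come from normalized mutual information.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two equal-length discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

def lambda_measure(densities, tol=1e-12):
    """Solve prod(1 + lam*g_i) = 1 + lam for the nonzero root lam of the
    lambda-fuzzy measure (bisection); lam = 0 iff the densities sum to 1.
    Assumes each density g_i lies in (0, 1)."""
    s = sum(densities)
    if abs(s - 1.0) < 1e-9:
        return 0.0
    def f(lam):
        p = 1.0
        for g in densities:
            p *= 1.0 + lam * g
        return p - (1.0 + lam)
    if s > 1.0:                      # root lies in (-1, 0)
        lo, hi = -1.0 + tol, -tol
    else:                            # root lies in (0, inf)
        lo, hi = tol, 1.0
        while f(hi) < 0:
            hi *= 2.0
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

def sugeno_integral(scores, densities):
    """Sugeno fuzzy integral of classifier scores with respect to the
    lambda-fuzzy measure built from the given densities:
    max over i of min(score_(i), g(A_i)), scores sorted descending."""
    lam = lambda_measure(densities)
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    g, best = 0.0, 0.0
    for k, i in enumerate(order):
        gi = densities[i]
        g = gi if k == 0 else g + gi + lam * g * gi
        best = max(best, min(scores[i], g))
    return best
```

For example, three component classifiers reporting confidences `[0.9, 0.5, 0.3]` for one class, with densities `[0.5, 0.3, 0.2]` (additive case, λ = 0), fuse to a support of 0.5 for that class; the class with the largest fused support is chosen. In a full system this fusion would run once per class, with per-class densities derived from each component classifier's mutual-information score.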

Metadata
Title
An improved multiple fuzzy NNC system based on mutual information and fuzzy integral
Author
Li Juan Wang
Publication date
01.03.2011
Publisher
Springer-Verlag
Published in
International Journal of Machine Learning and Cybernetics / Issue 1/2011
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-010-0006-8
