2009 | Original Paper | Book Chapter
Self-Organizing Neural Grove: Efficient Multiple Classifier System with Pruned Self-Generating Neural Trees
Author: Hirotaka Inoue
Published in: Constructive Neural Networks
Publisher: Springer Berlin Heidelberg
Multiple classifier systems (MCS) have become popular during the last decade. The self-generating neural tree (SGNT) is a suitable base classifier for an MCS because of its simple configuration and fast learning. However, the computational cost of the MCS increases in proportion to the number of SGNTs. In an earlier paper, we proposed a method for pruning the structure of the SGNTs in the MCS to reduce this cost. In this paper, we propose a novel pruning method for more effective processing and call the resulting model the self-organizing neural grove (SONG). The pruning method combines an on-line and an off-line pruning step. Experiments compare the SONG with an unpruned MCS based on SGNT, an MCS based on C4.5, and the k-nearest neighbor method. The results show that the SONG improves classification accuracy while reducing computational cost.
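To make the MCS idea concrete, the sketch below builds an ensemble of base classifiers on bootstrap resamples of the training data and combines them by majority vote. This is a minimal, hypothetical illustration of a multiple classifier system: the base learner here is a plain 1-nearest-neighbor memorizer standing in for the SGNT, whose construction and pruning are not detailed in this excerpt, and all function names are invented for the example.

```python
import random
from collections import Counter

def nn_predict(train, x):
    # 1-nearest-neighbor base classifier: return the label of the
    # training point closest (squared Euclidean distance) to x.
    # Stands in for an SGNT base classifier, which is not specified here.
    best = min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return best[1]

def build_ensemble(data, n_members=7, seed=0):
    # Train each ensemble member on a bootstrap resample of the data,
    # as in bagging-style multiple classifier systems.
    rng = random.Random(seed)
    return [[rng.choice(data) for _ in data] for _ in range(n_members)]

def predict(ensemble, x):
    # Combine the base classifiers by simple majority vote.
    votes = [nn_predict(member, x) for member in ensemble]
    return Counter(votes).most_common(1)[0][0]

if __name__ == "__main__":
    data = [((0.0, 0.0), "a"), ((0.1, 0.0), "a"),
            ((1.0, 1.0), "b"), ((0.9, 1.0), "b")]
    ensemble = build_ensemble(data, n_members=7, seed=1)
    print(predict(ensemble, (0.05, 0.0)))
```

Pruning, the focus of the paper, would then remove redundant structure from each base classifier (on-line, during construction) and redundant members from the ensemble (off-line, afterwards) to cut the cost that otherwise grows linearly with the number of members.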