2009 | OriginalPaper | Chapter
A Wrapper Method for Feature Selection in Multiple Classes Datasets
Authors: Noelia Sánchez-Maroño, Amparo Alonso-Betanzos, Rosa M. Calvo-Estévez
Published in: Bio-Inspired Systems: Computational and Ambient Intelligence
Publisher: Springer Berlin Heidelberg
Feature selection algorithms should remove irrelevant and redundant features while maintaining or even improving performance, thus contributing to enhanced generalization in learning models. Feature selection methods can be mainly grouped into filters and wrappers. Most existing models handle binary problems more or less adequately, but often underperform on multi-class tasks. In this article, a new wrapper method, called IAFN-FS (Incremental ANOVA and Functional Networks-Feature Selection), is described in its version for dealing with multiclass problems. To carry out the multiclass approach, two alternatives were tried: (a) treating the multiclass problem directly, and (b) dividing the original multiclass problem into several binary problems. To evaluate the performance of both approaches, a comparative study was carried out using several benchmark datasets, our two methods, and other wrappers based on classical algorithms such as C4.5 and Naive Bayes.
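The two multiclass strategies mentioned in the abstract can be illustrated with a generic wrapper. The sketch below is NOT the paper's IAFN-FS method (which relies on ANOVA decompositions and functional networks); it is a minimal greedy forward wrapper around a nearest-centroid classifier, used only to show (a) selecting features directly on the multiclass labels versus (b) one-vs-rest binarization followed by a union of the per-class selections. All names and the toy dataset are illustrative assumptions.

```python
# Hedged sketch of a wrapper feature selector, not the paper's IAFN-FS.
# Strategy (a): run the wrapper once on the multiclass labels.
# Strategy (b): binarize one-vs-rest, run the wrapper per class, take the union.
from statistics import mean


def centroid_accuracy(X, y, feats):
    """Training-set accuracy of a nearest-centroid classifier on `feats`."""
    classes = sorted(set(y))
    cents = {c: [mean(X[i][f] for i in range(len(X)) if y[i] == c)
                 for f in feats]
             for c in classes}
    correct = 0
    for xi, yi in zip(X, y):
        pred = min(classes,
                   key=lambda c: sum((xi[f] - m) ** 2
                                     for f, m in zip(feats, cents[c])))
        correct += pred == yi
    return correct / len(y)


def forward_wrapper(X, y):
    """Greedily add the feature that most improves the wrapped accuracy."""
    remaining = set(range(len(X[0])))
    selected, best = [], 0.0
    while remaining:
        f, acc = max(((f, centroid_accuracy(X, y, selected + [f]))
                      for f in remaining), key=lambda t: t[1])
        if acc <= best:          # stop when no feature helps any more
            break
        selected.append(f)
        remaining.discard(f)
        best = acc
    return selected


def one_vs_rest_wrapper(X, y):
    """Strategy (b): one binary wrapper per class, union of selections."""
    feats = set()
    for c in set(y):
        binary = [1 if yi == c else 0 for yi in y]
        feats |= set(forward_wrapper(X, binary))
    return sorted(feats)


# Toy data: feature 0 separates class 1, feature 1 separates class 2,
# feature 2 is constant noise that both strategies should discard.
X = [[0, 0, 5], [0, 0, 5],   # class 0
     [1, 0, 5], [1, 0, 5],   # class 1
     [0, 1, 5], [0, 1, 5]]   # class 2
y = [0, 0, 1, 1, 2, 2]

print(sorted(forward_wrapper(X, y)))   # strategy (a)
print(one_vs_rest_wrapper(X, y))       # strategy (b)
```

On this toy dataset both strategies keep features 0 and 1 and reject the noise feature; on real benchmarks the two decompositions can of course select different subsets, which is what the comparative study in the chapter examines.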