ABSTRACT
This paper proposes LCSE, a learning classifier system ensemble that extends the classical learning classifier system (LCS). The classical LCS comprises two major modules: a genetic algorithm module that drives rule discovery, and a reinforcement learning module that adjusts the strengths of the corresponding rules as rewards are received from the environment. LCSE builds a two-level ensemble architecture on top of LCS to improve its generalization. At the first level, new instances are bootstrapped and distributed to several LCSs for classification. At the second level, a plurality vote combines the classification results of the individual LCSs into a final decision. Experiments on benchmark data sets from the UCI repository show that LCSE generalizes better than a single LCS and than several other supervised learning methods.
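The two ensemble mechanics named in the abstract can be sketched independently of the LCS internals: first-level bootstrapping of instances and second-level plurality voting. The snippet below is a minimal illustration, not the paper's implementation; the function names and the stand-in member predictions are assumptions, and each LCS member is treated as a black box that has already produced a class label.

```python
import random
from collections import Counter

def bootstrap_sample(dataset, rng):
    """First level: draw len(dataset) instances with replacement,
    giving each LCS member its own resampled training set."""
    return [rng.choice(dataset) for _ in dataset]

def plurality_vote(member_predictions):
    """Second level: the class label predicted by the most members wins."""
    return Counter(member_predictions).most_common(1)[0][0]

# Hypothetical usage: three LCS members would each be trained on one
# bootstrap sample; here their per-instance predictions are hard-coded.
rng = random.Random(0)
train = [("x1", "A"), ("x2", "B"), ("x3", "A")]
sample = bootstrap_sample(train, rng)            # member-specific training set
decision = plurality_vote(["A", "B", "A", "A"])  # ensemble output: "A"
```

Ties in `Counter.most_common` are broken by insertion order; a real system might instead weight votes by each member's estimated accuracy.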