2014 | OriginalPaper | Chapter
A CFS-Based Feature Weighting Approach to Naive Bayes Text Classifiers
Authors: Shasha Wang, Liangxiao Jiang, Chaoqun Li
Published in: Artificial Neural Networks and Machine Learning – ICANN 2014
Publisher: Springer International Publishing
Recent work in supervised learning has shown that naive Bayes text classifiers with strong feature-independence assumptions, such as multinomial naive Bayes (MNB), complement naive Bayes (CNB) and the one-versus-all-but-one model (OVA), achieve remarkable classification performance. This fact raises the question of whether a naive Bayes text classifier with less restrictive assumptions can perform even better. Responding to this question, we first evaluate the correlation-based feature selection (CFS) approach in this paper and find that it performs even worse than the original versions. We then propose a CFS-based feature weighting approach to these naive Bayes text classifiers, calling the feature-weighted versions FWMNB, FWCNB and FWOVA, respectively. Our approach weakens the strong feature-independence assumptions by weighting the correlated features. Experimental results on a large suite of benchmark datasets show that our feature-weighted versions significantly outperform the original versions in terms of classification accuracy.
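The abstract does not spell out the weighting formula, so the following is only a minimal sketch of the general idea: derive a per-feature weight from a CFS-style feature–class correlation measure (symmetrical uncertainty, the merit measure CFS is built on) and let it scale the word counts in a multinomial naive Bayes model. The class name `WeightedMNB`, the use of binarized feature presence, and the exact way weights enter training and scoring are all illustrative assumptions, not the paper's actual FWMNB definition.

```python
import numpy as np

def symmetrical_uncertainty(f_bin, y):
    """SU(F, C) = 2 * IG(F; C) / (H(F) + H(C)), the correlation measure CFS uses."""
    def entropy(v):
        _, counts = np.unique(v, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))
    hf, hy = entropy(f_bin), entropy(y)
    # joint entropy H(F, C) over (feature, class) pairs
    joint = np.stack([f_bin, y], axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    hfy = -np.sum(p * np.log2(p))
    ig = hf + hy - hfy                      # information gain IG(F; C)
    denom = hf + hy
    return 0.0 if denom == 0 else 2.0 * ig / denom

class WeightedMNB:
    """Multinomial NB with per-feature weights derived from feature-class SU.

    Illustrative only: weights multiply the raw word counts both when the
    per-class term probabilities are estimated and at classification time.
    """

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # weight each feature by its correlation with the class label,
        # computed on binarized term presence (an assumption of this sketch)
        self.w_ = np.array([
            symmetrical_uncertainty((X[:, j] > 0).astype(int), y)
            for j in range(X.shape[1])
        ])
        Xw = X * self.w_                    # weights scale the word counts
        self.log_prior_ = np.log(
            np.array([(y == c).mean() for c in self.classes_])
        )
        counts = np.array([Xw[y == c].sum(axis=0) for c in self.classes_])
        smoothed = counts + 1.0             # Laplace smoothing
        self.log_theta_ = np.log(smoothed / smoothed.sum(axis=1, keepdims=True))
        return self

    def predict(self, X):
        scores = self.log_prior_ + (X * self.w_) @ self.log_theta_.T
        return self.classes_[np.argmax(scores, axis=1)]
```

A perfectly class-predictive term receives weight 1 and dominates the weighted log-likelihood, while a term whose presence is independent of the class receives weight 0 and is effectively ignored, which is one plausible way weighting correlated features can soften the independence assumption.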