In this paper we focus on class discrimination to improve the performance of text classification, and study a discrimination-based feature selection technique in which features are selected by the criterion of enlarging the separation among competing classes, referred to as discrimination capability. The proposed approach discards features with small discrimination capability, measured by Gaussian divergence, so as to enhance the robustness and discrimination power of the text classification system. To evaluate its performance, comparison experiments with a multinomial naïve Bayes classifier are conducted on the Newsgroups and Reuters-21578 data collections. Experimental results show that on the Newsgroups data set the divergence measure outperforms the MI measure and performs slightly better than the DF measure, and that it outperforms both measures on the Reuters-21578 data set. This indicates that discrimination-based feature selection contributes substantially to enhancing the discrimination power of the text classification model.
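The abstract does not give the scoring formula, but the general idea of divergence-based feature selection for a multinomial naïve Bayes model can be sketched as follows: estimate smoothed per-class word distributions, score each feature by a divergence between the class-conditional probabilities (a symmetric KL divergence is used here as a stand-in for the paper's Gaussian divergence), and keep only the top-scoring features. All function names and the choice of divergence below are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def divergence_scores(X, y, alpha=1.0):
    """Score each feature by its contribution to pairwise class divergence.

    X: (n_docs, n_features) term-count matrix; y: class labels.
    Uses Laplace-smoothed multinomial NB estimates of P(word | class).
    Note: symmetric KL is an illustrative stand-in for the paper's
    Gaussian divergence measure.
    """
    classes = np.unique(y)
    probs = []
    for c in classes:
        counts = X[y == c].sum(axis=0) + alpha  # add-alpha smoothing
        probs.append(counts / counts.sum())
    probs = np.asarray(probs)  # shape: (n_classes, n_features)

    # Per-feature symmetric KL term, summed over all class pairs:
    # (p - q) * log(p / q) is the feature's contribution to KL(p||q) + KL(q||p).
    scores = np.zeros(X.shape[1])
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            p, q = probs[i], probs[j]
            scores += (p - q) * np.log(p / q)
    return scores

def select_top_k(X, y, k):
    """Keep the k features with the largest discrimination capability."""
    scores = divergence_scores(X, y)
    return np.argsort(scores)[::-1][:k]
```

A feature whose conditional probability is nearly identical across classes (small discrimination capability) receives a score near zero and is discarded, which is exactly the filtering behavior the abstract describes.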