Wrapping the Naive Bayes classifier to relax the effect of dependences
IDEAL'07 Proceedings of the 8th international conference on Intelligent data engineering and automated learning
Feature subset selection with a wrapper searches for an optimal set of attributes, using the machine learning algorithm itself as a black box. The Naive Bayes classifier assumes that attribute values are independent given the class value, so its effectiveness may degrade when attributes are interdependent. We present FBL, a wrapper that uses information about attribute dependencies to guide the search for the optimal feature subset, with the Naive Bayes classifier as the black-box learning algorithm. Experimental results show that FBL improves the accuracy of the Naive Bayes classifier and outperforms several classical filters and wrappers.
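The abstract describes the wrapper approach only generically, and FBL's dependency-guided search is not specified here. As a minimal sketch of the general idea, the following shows a plain greedy forward-selection wrapper around a categorical Naive Bayes classifier with Laplace smoothing, using leave-one-out accuracy as the black-box evaluation of each candidate subset. All function names and the toy data are illustrative assumptions, not taken from the paper.

```python
import math

def nb_predict(train_X, train_y, cols, x):
    """Categorical Naive Bayes with Laplace smoothing, restricted to the
    attribute indices in `cols`. Returns the most probable class for `x`."""
    n = len(train_y)
    best_class, best_score = None, float("-inf")
    for c in sorted(set(train_y)):
        rows = [train_X[i] for i in range(n) if train_y[i] == c]
        score = math.log(len(rows) / n)  # log prior
        for j in cols:
            domain = {row[j] for row in train_X}
            matches = sum(1 for row in rows if row[j] == x[j])
            # smoothed log-likelihood of x[j] given class c
            score += math.log((matches + 1) / (len(rows) + len(domain)))
        if score > best_score:
            best_class, best_score = c, score
    return best_class

def loo_accuracy(X, y, cols):
    """Leave-one-out accuracy of the classifier restricted to `cols`:
    the wrapper's black-box evaluation of a candidate feature subset."""
    hits = 0
    for i in range(len(X)):
        tX, ty = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        if nb_predict(tX, ty, cols, X[i]) == y[i]:
            hits += 1
    return hits / len(X)

def forward_wrapper(X, y):
    """Greedy forward selection: keep adding the attribute that most
    improves the wrapped classifier's estimated accuracy."""
    selected = []
    remaining = list(range(len(X[0])))
    best_acc = 0.0
    while remaining:
        acc, j = max((loo_accuracy(X, y, selected + [j]), j)
                     for j in remaining)
        if acc <= best_acc:  # no attribute improves accuracy: stop
            break
        best_acc = acc
        selected.append(j)
        remaining.remove(j)
    return selected, best_acc

# Toy data: attribute 0 determines the class; attribute 1 is noise.
# The wrapper should select attribute 0 and discard the irrelevant one.
X = [("a", "x"), ("a", "y"), ("a", "x"),
     ("b", "y"), ("b", "x"), ("b", "y")]
y = [0, 0, 0, 1, 1, 1]
cols, acc = forward_wrapper(X, y)
print(cols, acc)  # attribute 0 alone gives perfect LOO accuracy
```

FBL differs from this plain wrapper in that it uses measured dependencies among attributes to steer which subsets are evaluated, rather than trying every remaining attribute at each step.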