Learning naive Bayes for probability estimation by feature selection
AI'06 Proceedings of the 19th international conference on Advances in Artificial Intelligence: Canadian Society for Computational Studies of Intelligence
Many approaches have been proposed to improve Naive Bayes, among which attribute selection has demonstrated remarkable performance. Attribute selection algorithms fall into two broad categories: filters and wrappers. Filters use general data characteristics to evaluate a candidate attribute subset before the learning algorithm is run, while wrappers use the learning algorithm itself as a black box to evaluate the subset. In this paper, we take the wrapper approach to attribute selection and propose an improved Naive Bayes algorithm that carries out a random search through the whole space of attributes. We simply call it Randomly Selected Naive Bayes (RSNB). To meet the respective needs of classification, ranking, and class probability estimation, we design three different versions: RSNB-ACC, RSNB-AUC, and RSNB-CLL. Experimental results on a large number of UCI datasets validate their effectiveness in terms of classification accuracy (ACC), area under the ROC curve (AUC), and conditional log likelihood (CLL), respectively.
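The wrapper scheme the abstract describes can be sketched as follows: repeatedly draw a random attribute subset, train Naive Bayes on just those attributes, and keep the subset the learner itself scores best. This is a minimal illustration, not the paper's implementation: the function name `random_subset_search`, the number of iterations, the use of scikit-learn's `GaussianNB`, and the cross-validated accuracy criterion (the ACC variant; an AUC or log-likelihood scorer could be swapped in for the other two versions) are all assumptions made for the sketch.

```python
# Hedged sketch of wrapper-style random attribute-subset selection for
# Naive Bayes. Each candidate subset is evaluated by the learning
# algorithm itself (cross-validated accuracy), per the wrapper idea.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

def random_subset_search(X, y, n_iter=30, seed=0):
    """Illustrative random search; names and defaults are not from the paper."""
    rng = np.random.default_rng(seed)
    n_attrs = X.shape[1]
    # Start from the full attribute set as the incumbent.
    best_mask = np.ones(n_attrs, dtype=bool)
    best_score = cross_val_score(GaussianNB(), X, y, cv=5).mean()
    for _ in range(n_iter):
        mask = rng.random(n_attrs) < 0.5  # random attribute subset
        if not mask.any():
            continue
        # Wrapper evaluation: score the subset with the learner itself.
        score = cross_val_score(GaussianNB(), X[:, mask], y, cv=5).mean()
        if score > best_score:
            best_mask, best_score = mask, score
    return best_mask, best_score

X, y = load_breast_cancer(return_X_y=True)
mask, score = random_subset_search(X, y)
print(f"selected {mask.sum()} of {X.shape[1]} attributes, CV accuracy {score:.3f}")
```

Swapping `scoring="roc_auc"` or `scoring="neg_log_loss"` into `cross_val_score` would mirror the AUC- and CLL-oriented versions, respectively.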