Lazy Bayesian Rules modifies naive Bayesian classification to undo harmful elements of the attribute independence assumption. It has been shown to achieve classification error comparable to that of boosted decision trees. This paper explores alternatives to the candidate elimination criterion employed within Lazy Bayesian Rules. Improvements over naive Bayes are consistent so long as the candidate elimination criterion ensures there is sufficient data for accurate probability estimation. However, the original candidate elimination criterion is demonstrated to provide better overall error reduction than a minimum data subset size criterion.
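To make the abstract's two moving parts concrete, here is a minimal, hypothetical sketch of a lazy Bayesian rule classifier in Python. It is illustrative only: the function names (`nb_train`, `nb_predict`, `lazy_rule_predict`), the greedy leave-one-out error-rate comparison, and the `min_subset` threshold are assumptions of this sketch, not the published Lazy Bayesian Rules algorithm, which grows the rule using a statistical test over leave-one-out errors rather than a raw error-rate comparison. For each test instance, the sketch greedily moves attribute-value tests that the instance satisfies into a rule antecedent, eliminating any candidate whose matching training subset is too small; the remaining attributes are handled by naive Bayes trained on the matching subset.

```python
import numpy as np

def nb_train(X, y, n_values, n_classes, alpha=1.0):
    """Categorical naive Bayes with Laplace smoothing.
    X: (n, d) int array, y: (n,) int array, n_values[j]: arity of attribute j."""
    n, d = X.shape
    class_counts = np.bincount(y, minlength=n_classes)
    priors = (class_counts + alpha) / (n + alpha * n_classes)
    cond = []  # cond[j][v, c] estimates P(attribute j = v | class c)
    for j in range(d):
        counts = np.zeros((n_values[j], n_classes))
        for v, c in zip(X[:, j], y):
            counts[v, c] += 1
        cond.append((counts + alpha) / (class_counts + alpha * n_values[j]))
    return priors, cond

def nb_predict(priors, cond, x, attrs):
    """Classify one instance using only the attributes listed in `attrs`."""
    log_p = np.log(priors)
    for j in attrs:
        log_p = log_p + np.log(cond[j][x[j]])
    return int(np.argmax(log_p))

def lazy_rule_predict(X, y, x, n_values, n_classes, min_subset=30):
    """Sketch of a lazy Bayesian rule: greedily move attribute-value tests
    that x satisfies into a rule antecedent, then classify with naive Bayes
    built from the matching training subset. A candidate test is eliminated
    when its matching subset has fewer than `min_subset` cases (a
    minimum-data-subset-size criterion, assumed here for illustration)."""
    d = X.shape[1]
    active = set(range(d))          # attributes still modelled independently
    mask = np.ones(len(y), bool)    # training cases matching the antecedent

    def loo_error(rows, attrs):
        # Leave-one-out error rate of naive Bayes over the selected rows.
        idx = np.flatnonzero(rows)
        errs = 0
        for i in idx:
            keep = rows.copy()
            keep[i] = False
            priors, cond = nb_train(X[keep], y[keep], n_values, n_classes)
            errs += nb_predict(priors, cond, X[i], attrs) != y[i]
        return errs / len(idx)

    best = loo_error(mask, sorted(active))
    improved = True
    while improved and active:
        improved = False
        for j in sorted(active):
            candidate = mask & (X[:, j] == x[j])
            if candidate.sum() < min_subset:    # candidate elimination
                continue
            err = loo_error(candidate, sorted(active - {j}))
            if err < best:                      # adopt the specialisation
                best, mask, active = err, candidate, active - {j}
                improved = True
                break

    priors, cond = nb_train(X[mask], y[mask], n_values, n_classes)
    return nb_predict(priors, cond, x, sorted(active))

# Hypothetical usage: the class depends on an interaction between the
# first two attributes, which plain naive Bayes cannot represent.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 4))
y = (X[:, 0] == X[:, 1]).astype(int)
print(lazy_rule_predict(X, y, X[0], n_values=[3, 3, 3, 3], n_classes=2))
```

The `min_subset` guard stands in for the minimum data subset size criterion the abstract compares against: it blocks specialisations that would leave too little data for accurate probability estimation, which is exactly the condition under which the abstract reports consistent improvements over naive Bayes.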