Semi-naive Bayesian classifiers seek to retain the numerous strengths of naive Bayes while reducing error by relaxing the attribute independence assumption. Backwards Sequential Elimination (BSE) is a wrapper technique for attribute elimination that has proved effective at this task. We explore a new technique, Lazy Elimination (LE), which eliminates highly related attribute values at classification time without the computational overheads inherent in wrapper techniques. We analyze the effect of LE and BSE on a state-of-the-art semi-naive Bayesian algorithm, Averaged One-Dependence Estimators (AODE). Our experiments show that LE significantly reduces bias and error without undue computation, while BSE significantly reduces bias but not error, and incurs high training time complexity. In the context of AODE, LE has a significant advantage over BSE in both computational efficiency and error.