On the Optimality of the Simple Bayesian Classifier under Zero-One Loss
Machine Learning - Special issue on learning with probabilistic representations
Data mining: practical machine learning tools and techniques with Java implementations
MultiBoosting: A Technique for Combining Boosting and Wagging
Machine Learning
Lazy Learning of Bayesian Rules
Machine Learning
Scientific Computing
On Bias, Variance, 0/1—Loss, and the Curse-of-Dimensionality
Data Mining and Knowledge Discovery
Machine Learning
Semi-Naive Bayesian Classifier
EWSL '91 Proceedings of the European Working Session on Machine Learning
Induction of Recursive Bayesian Classifiers
ECML '93 Proceedings of the European Conference on Machine Learning
ICML '99 Proceedings of the Sixteenth International Conference on Machine Learning
Bayesian Averaging of Classifiers and the Overfitting Problem
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Adjusted Probability Naive Bayesian Induction
AI '98 Selected papers from the 11th Australian Joint Conference on Artificial Intelligence on Advanced Topics in Artificial Intelligence
Candidate Elimination Criteria for Lazy Bayesian Rules
AI '01 Proceedings of the 14th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
SNNB: A Selective Neighborhood Based Naïve Bayes for Lazy Learning
PAKDD '02 Proceedings of the 6th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
Bayesian Artificial Intelligence
Ensemble selection from libraries of models
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Not So Naive Bayes: Aggregating One-Dependence Estimators
Machine Learning
Augmenting naive Bayes for ranking
ICML '05 Proceedings of the 22nd international conference on Machine learning
Classifying Requirements: Towards a More Rigorous Analysis of Natural-Language Specifications
ISSRE '05 Proceedings of the 16th IEEE International Symposium on Software Reliability Engineering
Efficient lazy elimination for averaged one-dependence estimators
ICML '06 Proceedings of the 23rd international conference on Machine learning
A Novel One-dependence Estimator Based on Multi-parents
ISDA '06 Proceedings of the Sixth International Conference on Intelligent Systems Design and Applications - Volume 01
A Comparison of Decision Tree Ensemble Creation Techniques
IEEE Transactions on Pattern Analysis and Machine Intelligence
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
Representing conditional independence using decision trees
AAAI'05 Proceedings of the 20th national conference on Artificial intelligence - Volume 2
Learning Bayesian networks with local structure
UAI'96 Proceedings of the Twelfth international conference on Uncertainty in artificial intelligence
To select or to weigh: a comparative study of model selection and model weighing for SPODE ensembles
ECML'06 Proceedings of the 17th European conference on Machine Learning
Ensemble selection for superparent-one-dependence estimators
AI'05 Proceedings of the 18th Australian Joint conference on Advances in Artificial Intelligence
Robust bayesian linear classifier ensembles
ECML'05 Proceedings of the 16th European conference on Machine Learning
Switching between selection and fusion in combining classifiers: an experiment
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Incremental construction of classifier and discriminant ensembles
Information Sciences: an International Journal
Graph-Based model-selection framework for large ensembles
HAIS'10 Proceedings of the 5th international conference on Hybrid Artificial Intelligence Systems - Volume Part I
Information Technology and Management
An ensemble of Bayesian networks for multilabel classification
IJCAI'13 Proceedings of the Twenty-Third international joint conference on Artificial Intelligence
Credal ensembles of classifiers
Computational Statistics & Data Analysis
We conduct a large-scale comparative study on linearly combining superparent-one-dependence estimators (SPODEs), a popular family of semi-naive Bayesian classifiers. In total, 16 model selection and weighing schemes and 58 benchmark data sets are employed, along with a variety of statistical tests. This paper's main contributions are three-fold. First, it formally presents each scheme's definition, rationale and time complexity, and hence can serve as a comprehensive reference for researchers interested in ensemble learning. Second, it offers a bias-variance analysis of each scheme's classification error. Third, it identifies effective schemes that meet various practical needs, leading to accurate and fast classification algorithms with immediate and significant impact on real-world applications. Another important feature of our study is the use of a variety of statistical tests to evaluate multiple learning methods across multiple data sets.
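To make the setting concrete, the sketch below shows what "linearly combining SPODEs" means: each SPODE estimates the joint probability P(y, x) by conditioning every attribute on the class and one fixed superparent attribute, and the ensemble scores a class by a weighted sum of the members' estimates. This is an illustrative toy implementation with Laplace smoothing and made-up data, not any of the 16 schemes studied in the paper; all names are hypothetical.

```python
# Hedged sketch of a linearly weighted SPODE ensemble (toy data, not from the paper).
from collections import Counter, defaultdict

class Spode:
    """One SPODE: P(y, x) ~ P(y, x_p) * prod_i P(x_i | y, x_p),
    where x_p is the fixed 'superparent' attribute."""
    def __init__(self, parent):
        self.p = parent

    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.n = len(y)
        self.joint = Counter()             # counts of (y, x_p)
        self.cond = defaultdict(Counter)   # counts of (i, x_i) given (y, x_p)
        for xi, yi in zip(X, y):
            key = (yi, xi[self.p])
            self.joint[key] += 1
            for i, v in enumerate(xi):
                self.cond[key][(i, v)] += 1
        return self

    def joint_prob(self, x, c):
        """Laplace-smoothed estimate of P(c, x)."""
        key = (c, x[self.p])
        prob = (self.joint[key] + 1) / (self.n + len(self.classes))
        for i, v in enumerate(x):
            prob *= (self.cond[key][(i, v)] + 1) / (self.joint[key] + 2)
        return prob

def ensemble_predict(spodes, weights, x):
    """Linear combination: score(c) = sum_p w_p * P_p(c, x)."""
    scores = {c: sum(w * s.joint_prob(x, c) for s, w in zip(spodes, weights))
              for c in spodes[0].classes}
    return max(scores, key=scores.get)

# Toy data: three binary attributes, two classes.
X = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
y = [0, 0, 1, 1]
spodes = [Spode(p).fit(X, y) for p in range(3)]
weights = [1 / 3] * 3  # uniform weights recover AODE-style averaging
print(ensemble_predict(spodes, weights, (0, 0, 1)))  # → 0
```

The model selection and weighing schemes compared in the paper correspond to different ways of choosing which SPODEs enter `spodes` and what values `weights` take, ranging from uniform averaging to data-driven weight estimation.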