A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM Symposium on Theory of Computing (STOC'94), May 23–25, 1994, and second annual European Conference on Computational Learning Theory (EuroCOLT'95), March 13–15, 1995
On the Optimality of the Simple Bayesian Classifier under Zero-One Loss
Machine Learning - Special issue on learning with probabilistic representations
The Random Subspace Method for Constructing Decision Forests
IEEE Transactions on Pattern Analysis and Machine Intelligence
MultiBoosting: A Technique for Combining Boosting and Wagging
Machine Learning
On the optimality of Naïve Bayes with dependent binary features
Pattern Recognition Letters
Data Mining: Practical Machine Learning Tools and Techniques, Second Edition (Morgan Kaufmann Series in Data Management Systems)
Statistical Comparisons of Classifiers over Multiple Data Sets
The Journal of Machine Learning Research
Classifier Ensembles with a Random Linear Oracle
IEEE Transactions on Knowledge and Data Engineering
Naive Bayes Classifiers That Perform Well with Continuous Variables
AI'04 Proceedings of the 17th Australian joint conference on Advances in Artificial Intelligence
Promoting Diversity in Gaussian Mixture Ensembles: An Application to Signature Verification
Biometrics and Identity Management
Ant Clustering Using Ensembles of Partitions
MCS '09 Proceedings of the 8th International Workshop on Multiple Classifier Systems
Logistic ensembles of Random Spherical Linear Oracles for microarray classification
International Journal of Data Mining and Bioinformatics
Naïve Bayes ensemble learning based on oracle selection
CCDC'09 Proceedings of the 21st annual Chinese Control and Decision Conference
IWANN'13 Proceedings of the 12th international conference on Artificial Neural Networks: advances in computational intelligence - Volume Part I
An investigation into the application of ensemble learning for entailment classification
Information Processing and Management: an International Journal
Ensemble methods with random oracles were proposed recently (Kuncheva and Rodríguez, 2007). A random-oracle classifier consists of a pair of classifiers and a fixed, randomly created oracle that selects between them. Ensembles of random-oracle decision trees were shown to fare better than standard ensembles; in that study, the oracle for a given tree was a random hyperplane at the root of the tree. The present work considers two types of random oracles (linear and spherical) in ensembles of Naive Bayes (NB) classifiers. Our experiments show that ensembles based solely on the spherical oracle (and no other ensemble heuristic) outperform Bagging, Wagging, Random Subspaces, AdaBoost.M1, MultiBoost and Decorate. Moreover, all of these ensemble methods perform better with either of the two random oracles than their standard versions without oracles.
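To make the mechanism concrete, here is a minimal sketch of a single random-linear-oracle ensemble member, not the authors' implementation: the oracle is the perpendicular bisector of two randomly chosen training points, and a toy Gaussian Naive Bayes is trained on each side. All class names and the degenerate-split guard are illustrative assumptions.

```python
import numpy as np

class GaussianNB:
    """Toy Gaussian Naive Bayes (illustrative, not the paper's exact NB setup)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_, self.vars_, self.priors_ = [], [], []
        for c in self.classes_:
            Xc = X[y == c]
            self.means_.append(Xc.mean(axis=0))
            self.vars_.append(Xc.var(axis=0) + 1e-9)  # variance smoothing
            self.priors_.append(len(Xc) / len(X))
        return self

    def predict(self, X):
        scores = []
        for m, v, p in zip(self.means_, self.vars_, self.priors_):
            # Gaussian log-likelihood under the naive independence assumption
            ll = -0.5 * (np.log(2 * np.pi * v) + (X - m) ** 2 / v).sum(axis=1)
            scores.append(ll + np.log(p))
        return self.classes_[np.argmax(scores, axis=0)]

class RandomLinearOracleNB:
    """One ensemble member: a fixed random hyperplane routes each point
    to one of two NB classifiers trained on the corresponding half-space."""
    def __init__(self, rng):
        self.rng = rng

    def fit(self, X, y):
        # Random linear oracle: perpendicular bisector of two random training points.
        i, j = self.rng.choice(len(X), size=2, replace=False)
        self.w_ = X[i] - X[j]
        self.b_ = -self.w_ @ (X[i] + X[j]) / 2
        side = X @ self.w_ + self.b_ >= 0
        if side.all() or (~side).all():  # guard against a degenerate split (assumption)
            side = np.zeros(len(X), dtype=bool)
            side[: len(X) // 2] = True
        self.nb_pos_ = GaussianNB().fit(X[side], y[side])
        self.nb_neg_ = GaussianNB().fit(X[~side], y[~side])
        return self

    def predict(self, X):
        side = X @ self.w_ + self.b_ >= 0
        out = np.empty(len(X), dtype=int)
        if side.any():
            out[side] = self.nb_pos_.predict(X[side])
        if (~side).any():
            out[~side] = self.nb_neg_.predict(X[~side])
        return out
```

An ensemble would hold many such members, each with its own independently drawn oracle, and combine their votes. The spherical variant described in the abstract would replace the hyperplane test with membership in a random hypersphere (a random center and radius), routing points inside the sphere to one NB and points outside to the other.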