On the Optimality of the Simple Bayesian Classifier under Zero-One Loss
Machine Learning - Special issue on learning with probabilistic representations
Object Tracking Using Naive Bayesian Classifiers
ACIVS '08 Proceedings of the 10th International Conference on Advanced Concepts for Intelligent Vision Systems
Naïve Bayes ensembles with a random oracle
MCS '07 Proceedings of the 7th International Conference on Multiple Classifier Systems
Zero-day malware detection based on supervised learning algorithms of API call signatures
AusDM '11 Proceedings of the Ninth Australasian Data Mining Conference - Volume 121
Gabor wavelets combined with volumetric fractal dimension applied to texture analysis
Pattern Recognition Letters
While the naive Bayes classifier (NB) is Bayes-optimal when the features are class-conditionally independent, we prove that it is also optimal for two equiprobable classes and two features with equal class-conditional covariances. Although strict optimality does not extend to three features, equal covariances are expected to remain beneficial in higher-dimensional spaces.
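To make the setting concrete, the following is a minimal sketch, not the paper's construction: a Gaussian instance with two equiprobable classes, two features, and identical class-conditional covariance matrices. The means, shared covariance, and sample size are illustrative assumptions, chosen so the naive Bayes and full Bayes linear rules point in the same direction and therefore classify every point identically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions for this sketch, not from the
# paper): two equiprobable classes, two features, and one covariance
# matrix shared by both class-conditional Gaussians.
mu0, mu1 = np.array([0.0, 0.0]), np.array([1.0, 1.0])
cov = np.array([[1.0, 0.6], [0.6, 1.0]])  # shared by both classes

n = 50_000
y = rng.integers(0, 2, size=n)
X = np.where((y == 0)[:, None],
             rng.multivariate_normal(mu0, cov, size=n),
             rng.multivariate_normal(mu1, cov, size=n))

mid = (mu0 + mu1) / 2
# Full Bayes rule: with equal covariances and equal priors the optimal
# boundary is linear, with weight vector cov^{-1} (mu1 - mu0).
w_bayes = np.linalg.solve(cov, mu1 - mu0)
# Naive Bayes ignores the off-diagonal covariance term and keeps only
# the per-feature marginal variances diag(cov).
w_nb = (mu1 - mu0) / np.diag(cov)

pred_bayes = ((X - mid) @ w_bayes > 0).astype(int)
pred_nb = ((X - mid) @ w_nb > 0).astype(int)

err_bayes = np.mean(pred_bayes != y)
err_nb = np.mean(pred_nb != y)
print(f"Bayes error {err_bayes:.4f}  naive Bayes error {err_nb:.4f}")
```

In this symmetric configuration `w_bayes` is a positive multiple of `w_nb`, so the two decision rules agree everywhere and NB attains the Bayes error; with means not aligned this way the two hyperplanes generally differ, which is why the result in the abstract is a nontrivial claim rather than a generic property of equal-covariance Gaussians.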