We address the problem of efficiently learning Naive Bayes classifiers under class-conditional classification noise (CCCN). Naive Bayes classifiers rely on the hypothesis that the distribution associated with each class is a product distribution. When the data is subject to CCCN, these class-conditional distributions are themselves mixtures of product distributions. We give analytical formulas that make it possible to identify them from data subject to CCCN. We then design a learning algorithm, based on these formulas, that learns Naive Bayes classifiers under CCCN. We present results on artificial datasets and on datasets extracted from the UCI repository. These results show that CCCN can be handled efficiently and successfully.
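The abstract does not spell out the identification step, but the core idea can be sketched as follows. Under CCCN, the feature frequencies observed within each *noisy* class are two-component mixtures of the true per-class Bernoulli parameters, so with known flip rates each feature yields a 2x2 linear system that can be solved exactly. The sketch below is a minimal illustration under simplifying assumptions it does not take from the paper: binary features, *known* noise rates `rho0` and `rho1` (the paper identifies the needed quantities from data), and `rho0 + rho1 < 1` for identifiability. The function name and synthetic-data setup are hypothetical.

```python
import random

def unmix_product_params(X, y_noisy, rho0, rho1):
    """Recover the per-feature Bernoulli parameters of the two true
    class-conditional product distributions from CCCN-corrupted labels.

    rho0 = P(noisy label 1 | true class 0), rho1 = P(noisy label 0 | true class 1).
    Both are assumed known here; identifiability needs rho0 + rho1 < 1.
    """
    n, d = len(X), len(X[0])
    n1 = sum(y_noisy)
    n0 = n - n1
    # Observed per-feature frequencies within each *noisy* class.
    q0 = [sum(x[j] for x, yy in zip(X, y_noisy) if yy == 0) / n0 for j in range(d)]
    q1 = [sum(x[j] for x, yy in zip(X, y_noisy) if yy == 1) / n1 for j in range(d)]
    # Recover the true class priors from the observed noisy-label frequency:
    # P(y^=1) = pi1*(1-rho1) + pi0*rho0  =>  pi1 = (P(y^=1) - rho0)/(1 - rho0 - rho1)
    pi1 = (n1 / n - rho0) / (1.0 - rho0 - rho1)
    pi0 = 1.0 - pi1
    # Posterior weight of each true class inside its noisy class.
    a = pi0 * (1 - rho0) / (pi0 * (1 - rho0) + pi1 * rho1)  # P(y=0 | y^=0)
    b = pi1 * (1 - rho1) / (pi1 * (1 - rho1) + pi0 * rho0)  # P(y=1 | y^=1)
    det = a + b - 1.0
    # Each observed frequency is a two-component mixture; per feature, solve
    #   q0 = a*p0 + (1-a)*p1,   q1 = (1-b)*p0 + b*p1.
    p0 = [(b * q0[j] - (1 - a) * q1[j]) / det for j in range(d)]
    p1 = [(a * q1[j] - (1 - b) * q0[j]) / det for j in range(d)]
    return p0, p1, pi0, pi1

# Synthetic check (hypothetical parameters): draw labels, sample product-
# distributed binary features, then flip each label with its class's rate.
random.seed(0)
true_p0, true_p1 = [0.8, 0.2, 0.6], [0.1, 0.9, 0.4]
rho0, rho1 = 0.2, 0.3
X, y_noisy = [], []
for _ in range(100000):
    y = 1 if random.random() < 0.4 else 0
    p = true_p1 if y else true_p0
    X.append([1 if random.random() < pj else 0 for pj in p])
    flip = rho1 if y else rho0
    y_noisy.append((1 - y) if random.random() < flip else y)

p0, p1, pi0, pi1 = unmix_product_params(X, y_noisy, rho0, rho1)
```

The recovered `p0`, `p1`, and priors can then be plugged into an ordinary Naive Bayes decision rule; this sketch deliberately omits the harder step of estimating the noise rates themselves.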