The Naive Bayes classifier is the simplest of the Bayesian network classifiers and has proven very efficient on a variety of data classification problems. However, its strong assumption that all features are conditionally independent given the class is violated in many real-world applications. Consequently, improving the Naive Bayes classifier by relaxing the feature independence assumption has attracted much attention. In this paper, we develop a new version of the Naive Bayes classifier that does not assume independence of features. The proposed algorithm approximates the interactions between features by using conditional probabilities. We present results of numerical experiments on several real-world data sets, where continuous features are discretized by two different methods. These results demonstrate that the proposed algorithm significantly improves the performance of the Naive Bayes classifier while maintaining its robustness.
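To make the conditional independence assumption concrete, below is a minimal Python sketch of the standard Naive Bayes decision rule for categorical features, i.e. argmax over classes c of log P(c) + sum_i log P(x_i | c). This is the baseline the abstract refers to, not the authors' proposed extension; all function names and the toy data are illustrative, and Laplace smoothing is an assumed implementation choice.

    # Minimal baseline Naive Bayes for categorical features (illustrative sketch).
    # The paper's proposed algorithm, which relaxes the independence assumption
    # via conditional probabilities between features, is NOT implemented here.
    from collections import defaultdict
    import math

    def train_naive_bayes(X, y):
        """Count statistics for class priors P(c) and likelihoods P(x_i | c)."""
        class_counts = defaultdict(int)
        feature_counts = defaultdict(int)  # keyed by (class, feature_index, value)
        for features, label in zip(X, y):
            class_counts[label] += 1
            for i, v in enumerate(features):
                feature_counts[(label, i, v)] += 1
        return class_counts, feature_counts

    def predict(features, class_counts, feature_counts, n_values=2):
        """Return argmax_c log P(c) + sum_i log P(x_i | c), Laplace-smoothed."""
        total = sum(class_counts.values())
        best, best_score = None, -math.inf
        for c, count in class_counts.items():
            score = math.log(count / total)  # log prior
            for i, v in enumerate(features):
                # Independence assumption: each feature contributes its own
                # factor P(x_i | c), regardless of the other features.
                num = feature_counts[(c, i, v)] + 1
                score += math.log(num / (count + n_values))
            if score > best_score:
                best, best_score = c, score
        return best

    # Toy usage: two binary features, two classes.
    X = [(1, 1), (1, 0), (0, 1), (0, 0)]
    y = ["pos", "pos", "neg", "neg"]
    cc, fc = train_naive_bayes(X, y)
    print(predict((1, 1), cc, fc))  # expected: "pos"

The per-class score factorizes completely across features, which is exactly the assumption the paper's algorithm aims to relax by modeling feature interactions with conditional probabilities.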