Machine Learning - Special issue on learning with probabilistic representations
Most methods designed within the framework of Bayesian networks (BNs) assume that the variables involved are discrete, but this assumption rarely holds in real problems. For example, the Bayesian classifier AODE (Aggregating One-Dependence Estimators) can only work directly with discrete variables. The HAODE (Hybrid AODE) classifier has been proposed as an appealing alternative to AODE that is less affected by the discretization process. In this paper, we study whether this behavior holds when different discretization methods are applied. More importantly, we include other Bayesian classifiers in the comparison to find out to what extent the choice of discretization method affects their results in terms of accuracy and bias-variance decomposition. If the discretization method applied is not decisive, then future experiments can be run k times faster, k being the number of discretization methods considered.
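To illustrate why the choice of discretization method can matter for classifiers such as AODE, the following is a minimal sketch (not the paper's code) of two common unsupervised discretization schemes, equal-width and equal-frequency binning; the function names and the bin-index convention are illustrative assumptions:

```python
def equal_width(values, k):
    """Discretize continuous values into k bins of equal width.

    Bin boundaries depend only on the min and max, so a single
    outlier can push most values into one bin.
    """
    lo, hi = min(values), max(values)
    width = (hi - lo) / k or 1.0  # guard against all-equal values
    # Clamp the top value into the last bin (index k - 1).
    return [min(int((v - lo) / width), k - 1) for v in values]


def equal_frequency(values, k):
    """Discretize into k bins holding roughly equal numbers of values.

    Bins follow the empirical distribution, so the same data can be
    partitioned very differently than under equal-width binning.
    """
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for rank, i in enumerate(order):
        bins[i] = min(rank * k // len(values), k - 1)
    return bins
```

On skewed data the two methods disagree: for `[1.0, 2.0, 2.5, 9.0]` with `k = 2`, equal-width binning yields `[0, 0, 0, 1]` while equal-frequency binning yields `[0, 0, 1, 1]`, so a discrete-only classifier would see different training data in each case.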