Among the many attempts to improve the Naive Bayes (NB) classifier, Aggregating One-Dependence Estimators (AODE) has proved to be one of the most attractive, owing to both its low error and its efficiency. AODE builds one SPODE (Superparent-One-Dependence Estimator) per attribute of the database, using that attribute as the superparent, and uniformly averages their predictions. Nevertheless, AODE has properties that can be improved. Firstly, the need to store all the constructed models imposes a high space demand, which makes high-dimensional problems intractable; secondly, although it is fast, both its training time and its classification time are quadratic in the number of attributes. This is especially significant for classification, which is frequently carried out in real time. In this paper, we propose the HODE classifier as an alternative to AODE that alleviates these problems by estimating a new hidden variable as a superparent alongside the class, whose main objective is to capture the dependences represented by the AODE models. The results obtained show that this new algorithm achieves similar accuracy while reducing classification time and space complexity.
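To make the averaging scheme concrete, the following is a minimal sketch of AODE-style prediction for discrete data: each attribute in turn plays the superparent, the corresponding SPODE joint estimate is computed with Laplace smoothing, and the SPODE scores are uniformly averaged. The function name and the smoothing details are illustrative assumptions, not the authors' implementation (which, as noted above, HODE replaces with a single hidden-variable model).

```python
import numpy as np

def aode_predict(X, y, x_new, n_classes, n_values):
    """Illustrative AODE sketch (hypothetical helper, not the authors' code).

    X       : (n_samples, n_attrs) integer-coded training data
    y       : (n_samples,) class labels in {0, ..., n_classes-1}
    x_new   : (n_attrs,) instance to classify
    n_values: number of distinct values of each attribute
    Returns normalized class scores.
    """
    n, d = X.shape
    scores = np.zeros(n_classes)
    for c in range(n_classes):
        spode_sum = 0.0
        for sp in range(d):  # each attribute acts as superparent once
            # P(c, x_sp) with Laplace smoothing
            joint = ((np.sum((y == c) & (X[:, sp] == x_new[sp])) + 1.0)
                     / (n + n_classes * n_values[sp]))
            mask = (y == c) & (X[:, sp] == x_new[sp])
            cond = 1.0
            for i in range(d):
                if i == sp:
                    continue
                # P(x_i | c, x_sp), smoothed
                num = np.sum(mask & (X[:, i] == x_new[i])) + 1.0
                den = np.sum(mask) + n_values[i]
                cond *= num / den
            spode_sum += joint * cond
        scores[c] = spode_sum / d  # uniform average over the d SPODEs
    return scores / scores.sum()
```

The quadratic cost discussed above is visible here: the two nested loops over `sp` and `i` touch every ordered attribute pair, and one model (set of counts) per superparent must be kept, which is exactly the space overhead that HODE's single hidden-variable model avoids.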