One related area that has received little attention with regard to AODE is the use of attribute weights for ranking. This paper investigates how to learn an attribute weighted AODE (AWAODE) that produces accurate rankings from data. We first explore several weighting methods: gain ratio, the correlation-based feature selection (CFS) attribute selection algorithm, mutual information, and the Relief attribute ranking algorithm. Our experiments clearly show that an attribute weighted AODE trained to produce accurate rankings, measured by AUC, outperforms both AODE and naive Bayes (NB). We then propose a new approach to weighting AODE for accurate ranking, called the decision tree-based attribute weighted averaged one-dependence estimator, or simply DTWAODE. In DTWAODE, the weight of an attribute is set according to its depth in a decision tree built on the training samples. The experimental results show that this new attribute weighted AODE model performs more effectively than AODE and the other attribute weighting approaches in terms of AUC.
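The depth-based weighting idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a decision tree represented as nested dicts, and it assumes a weight of the form 1/sqrt(depth) for attributes tested in the tree (a common choice for depth-based weighting filters) with a default weight of zero for attributes the tree never tests. The tree representation, the `dtw_weights` helper, and the exact weighting formula are all illustrative assumptions.

```python
import math

def attribute_depths(tree, depth=1, depths=None):
    """Walk a decision tree (nested dicts keyed by 'attr'/'children') and
    record the shallowest depth at which each attribute is tested."""
    if depths is None:
        depths = {}
    if not isinstance(tree, dict):  # leaf node (a class label string)
        return depths
    attr = tree["attr"]
    depths[attr] = min(depth, depths.get(attr, depth))
    for child in tree["children"].values():
        attribute_depths(child, depth + 1, depths)
    return depths

def dtw_weights(tree, attributes, default=0.0):
    """Hypothetical depth-based weighting: attributes tested nearer the
    root get larger weights (here 1/sqrt(depth)); attributes absent from
    the tree get a default weight."""
    depths = attribute_depths(tree)
    return {a: 1.0 / math.sqrt(depths[a]) if a in depths else default
            for a in attributes}

# Toy tree: the root tests "outlook"; the "sunny" branch tests "humidity".
tree = {"attr": "outlook",
        "children": {"sunny": {"attr": "humidity",
                               "children": {"high": "no", "normal": "yes"}},
                     "overcast": "yes"}}

weights = dtw_weights(tree, ["outlook", "humidity", "windy"])
# "outlook" is at depth 1, "humidity" at depth 2, "windy" is unused.
```

These per-attribute weights would then scale each attribute's contribution inside the averaged one-dependence estimators, so that attributes the tree finds most discriminative (tested closest to the root) influence the ranking most.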