Decision trees are among the most effective and widely used methods for classification. However, many real-world applications require instances to be ranked by their probability of class membership. The area under the receiver operating characteristic curve (AUC) has recently been adopted as a measure of the ranking performance of learning algorithms. In this paper, we present two novel class probability estimation algorithms that improve the ranking performance of decision trees. Instead of estimating the probability of class membership by simple voting at the leaf into which a test instance falls, our algorithms use similarity-weighted voting and naive Bayes. Our empirical experiments verify that both new algorithms significantly outperform C4.4, a recent decision tree ranking algorithm, in terms of AUC.
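The similarity-weighted voting idea can be illustrated with a minimal sketch. This is not the paper's implementation; the similarity measure (fraction of matching attribute values), the Laplace-style correction, and all function names are illustrative assumptions about how votes at a leaf might be weighted by each training instance's similarity to the test instance.

```python
# Sketch of similarity-weighted class probability estimation at a
# decision-tree leaf. All names and design choices here are
# illustrative assumptions, not the authors' actual algorithm.
from collections import defaultdict

def similarity(a, b):
    """Fraction of attribute values shared by two instances
    (an assumed overlap-based similarity measure)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def leaf_probability(leaf_instances, leaf_labels, test_instance, classes):
    """Estimate P(class | test_instance) from the training instances
    that reached the same leaf, weighting each vote by its similarity
    to the test instance, with a Laplace-style correction so that no
    class receives probability zero."""
    weights = defaultdict(float)
    for inst, label in zip(leaf_instances, leaf_labels):
        weights[label] += similarity(inst, test_instance)
    total = sum(weights.values()) + len(classes)  # Laplace correction
    return {c: (weights[c] + 1.0) / total for c in classes}
```

Compared with simple voting (which gives every test instance at the same leaf identical probabilities, producing many ties in the ranking), the weighted estimate varies smoothly with the test instance, which is what allows finer-grained rankings and thus higher AUC.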