The problem of learning label rankings is receiving increasing attention from several research communities. A number of common learning algorithms have been adapted for this task, including k-Nearest Neighbours (k-NN) and decision trees. Following this line, we propose an adaptation of the naive Bayes classification algorithm for the label ranking problem. The core idea is to replace probabilities with similarities between rankings. We evaluate the proposed method empirically on metalearning problems, which relate characteristics of learning problems to the relative performance of learning algorithms. The method generally outperforms the baseline, indicating that it is able to identify some of the underlying patterns in the data.
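The abstract does not spell out the estimators, but the general idea of substituting ranking similarity for probability can be sketched as follows. This is an illustrative sketch only, assuming Spearman's rank correlation (rescaled to [0, 1]) as the similarity measure, nominal features, and a brute-force search over all candidate permutations; the function names and the overall structure are hypothetical, not the paper's actual implementation.

```python
from itertools import permutations

import numpy as np


def ranking_similarity(r1, r2):
    """Spearman rank correlation between two rankings, rescaled to [0, 1].

    This choice of similarity measure is an assumption for illustration.
    """
    d = np.asarray(r1) - np.asarray(r2)
    n = len(d)
    rho = 1 - 6 * np.sum(d ** 2) / (n * (n ** 2 - 1))
    return (rho + 1) / 2


def predict_ranking(X_train, R_train, x):
    """Naive-Bayes-style label ranker: the 'prior' and per-feature
    'likelihood' terms are mean ranking similarities rather than
    probabilities, combined multiplicatively as in naive Bayes.
    """
    n_labels = len(R_train[0])
    best, best_score = None, -1.0
    for pi in permutations(range(1, n_labels + 1)):
        # "Prior" term: average similarity of pi to all training rankings.
        score = np.mean([ranking_similarity(pi, r) for r in R_train])
        # One "likelihood" term per feature, restricted to training
        # examples that share the feature value with the query x.
        for j, v in enumerate(x):
            idx = [i for i, xi in enumerate(X_train) if xi[j] == v]
            if idx:
                score *= np.mean(
                    [ranking_similarity(pi, R_train[i]) for i in idx]
                )
        if score > best_score:
            best, best_score = pi, score
    return best
```

The brute-force enumeration of permutations is only feasible for small label sets; it is used here to keep the sketch short, not as a recommendation for practical use.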