Multi-label classification is a learning task in which each instance may be associated with multiple classes simultaneously. Designing efficient and effective multi-label algorithms remains a challenging problem. The k-nearest neighbor (kNN) method and its weighted form (WkNN) are simple but effective for binary and multi-class classification. In this paper, we construct a weighted kNN method for multi-label classification (MLC-WkNN) based on the Bayes theorem. By approximating a query instance with a linear weighted sum of its k nearest neighbors in the least-squared-error sense, the weights are determined adaptively by solving a quadratic program with a unit simplex constraint. Notably, MLC-WkNN remains a model-free, instance-based learning technique with only a single tunable parameter, k. An experimental study on two benchmark data sets (Image and Yeast) shows that MLC-WkNN outperforms seven existing high-performing multi-label algorithms.
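The weight-determination step described above can be illustrated with a small sketch. The abstract does not give the optimization procedure used by the authors, so the following is only one plausible way to solve the stated quadratic program: minimize ||x - Σᵢ wᵢ nᵢ||² over the unit simplex (wᵢ ≥ 0, Σᵢ wᵢ = 1) via projected gradient descent with a Euclidean projection onto the simplex. The function names (`simplex_projection`, `neighbor_weights`) and the solver choice are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def simplex_projection(v):
    """Euclidean projection of v onto the unit simplex {w : w >= 0, sum(w) = 1}."""
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)      # shift so the projection sums to 1
    return np.maximum(v - theta, 0.0)

def neighbor_weights(x, neighbors, n_iter=500):
    """Weights w minimizing ||x - N^T w||^2 over the unit simplex,
    found here by projected gradient descent (a sketch of the QP solve)."""
    N = np.asarray(neighbors, dtype=float)    # shape (k, d): the k nearest neighbors
    k = N.shape[0]
    w = np.full(k, 1.0 / k)                   # start from uniform weights
    G = N @ N.T                               # Gram matrix of the neighbors
    b = N @ np.asarray(x, dtype=float)
    lr = 1.0 / (np.linalg.norm(G, 2) + 1e-12) # step size from the Lipschitz constant
    for _ in range(n_iter):
        grad = G @ w - b                      # gradient of 0.5 * ||x - N^T w||^2
        w = simplex_projection(w - lr * grad)
    return w
```

The resulting weights could then feed a Bayesian/weighted voting rule over the neighbors' label sets to produce the multi-label prediction; that scoring step is not shown here.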