On the limited memory BFGS method for large scale optimization
Mathematical Programming: Series A and B
BoosTexter: A Boosting-based System for Text Categorization
Machine Learning - Special issue on information retrieval
Large Margin Methods for Structured and Interdependent Output Variables
The Journal of Machine Learning Research
Fast maximum margin matrix factorization for collaborative prediction
Proceedings of the 22nd International Conference on Machine Learning (ICML '05)
ML-KNN: A lazy learning approach to multi-label learning
Pattern Recognition
Incremental Algorithms for Hierarchical Classification
The Journal of Machine Learning Research
Uncovering shared structures in multiclass classification
Proceedings of the 24th International Conference on Machine Learning
LIBLINEAR: A Library for Large Linear Classification
The Journal of Machine Learning Research
Convex multi-task feature learning
Machine Learning
An accelerated gradient method for trace norm minimization
Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09)
IEEE Transactions on Pattern Analysis and Machine Intelligence
Distance metric learning with eigenvalue optimization
The Journal of Machine Learning Research
Matching pursuits with time-frequency dictionaries
IEEE Transactions on Signal Processing
Multilabel classification with principal label space transformation
Neural Computation
Multi-label learning refers to methods for learning a classification function that predicts a set of relevant labels for an instance. Label embedding seeks a transformation that maps labels into a latent space, where regression is performed to predict the set of relevant labels. The latent space is often low-dimensional, so computational and space complexities are reduced; however, the choice of an appropriate transformation into the latent space is not obvious. In this paper we present a max-margin embedding method in which both instances and labels are mapped into a common low-dimensional latent space. In contrast to existing label embedding methods, the pair of instance and label embeddings is determined by minimizing a cost-sensitive multi-label hinge loss, in which a label-dependent cost penalizes the misclassification of positive examples more heavily. For implementation, we employ the limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method to determine the instance and label embeddings by joint optimization. Numerical experiments on several datasets demonstrate the high performance of our method compared to existing embedding methods when the dimensionality of the latent space is much smaller than that of the original label space.
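The joint optimization described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the embedding matrices `W` (instances) and `V` (labels), the score `x_i W V_j^T`, the assumed costs `c_pos`/`c_neg`, and the toy data are all hypothetical, and a plain (non-smoothed) hinge is minimized with SciPy's L-BFGS-B via numerical gradients, whereas a real implementation would typically supply analytic gradients of a smoothed loss.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: n instances with d features, L labels in {-1, +1}.
rng = np.random.default_rng(0)
n, d, L, k = 50, 10, 5, 3  # k = latent-space dimensionality (k << L in general)
X = rng.normal(size=(n, d))
Y = np.where(rng.random((n, L)) < 0.3, 1.0, -1.0)

# Label-dependent costs (assumed values): positive-label errors cost more.
c_pos, c_neg = 4.0, 1.0
cost = np.where(Y > 0, c_pos, c_neg)

def loss(theta):
    """Cost-sensitive multi-label hinge loss over joint parameters (W, V)."""
    W = theta[:d * k].reshape(d, k)   # instance embedding: R^d -> R^k
    V = theta[d * k:].reshape(L, k)   # label embedding:    labels -> R^k
    margins = 1.0 - Y * (X @ W @ V.T)
    return np.sum(cost * np.maximum(margins, 0.0)) / n

# Joint optimization of both embeddings with limited-memory BFGS.
theta0 = rng.normal(scale=0.1, size=d * k + L * k)
res = minimize(loss, theta0, method="L-BFGS-B")
print(loss(theta0), res.fun)  # training loss before and after optimization
```

Predicting labels for a new instance then reduces to embedding it with `W` and thresholding its scores against each row of `V`.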