Geometry preserving multi-task metric learning
ECML PKDD'12 Proceedings of the 2012 European conference on Machine Learning and Knowledge Discovery in Databases - Volume Part I
In this paper, we consider the multi-task metric learning problem, i.e., the problem of learning multiple metrics from several correlated tasks simultaneously. Despite its importance, only a limited number of approaches have been developed in this field, and the existing methods often extend vector-based single-task methods in a straightforward way. Instead, we propose to couple multiple related metric learning tasks with the von Neumann divergence. On one hand, the novel regularized approach generalizes previous methods from vector regularization to a general matrix-regularization framework; on the other hand, and more importantly, by exploiting the von Neumann divergence as the regularizer, the new multi-task metric learning method is able to preserve the data geometry well. This leads to more appropriate propagation of side information among tasks and offers the potential for further performance improvement. We introduce the concept of geometry-preserving probability and show theoretically that our framework encourages a higher geometry-preserving probability. In addition, our formulation proves to be jointly convex, so the globally optimal solution is guaranteed. Extensive experiments on six data sets from very different disciplines verify that our proposed approach consistently outperforms almost all the compared methods.
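To make the coupling term concrete, the following is a minimal sketch of the von Neumann divergence D(A‖B) = tr(A log A − A log B − A + B) between two symmetric positive (semi)definite metric matrices, which is the quantity used to regularize the distance between task metrics in the abstract above. The function name, the eigendecomposition-based matrix logarithm, and the `eps` clamp for near-zero eigenvalues are illustrative choices, not part of the paper's formulation.

```python
import numpy as np

def von_neumann_divergence(A, B, eps=1e-12):
    """Von Neumann divergence D(A||B) = tr(A log A - A log B - A + B)
    for symmetric positive (semi)definite matrices A and B."""
    def logm_sym(M):
        # Matrix logarithm via eigendecomposition of a symmetric matrix.
        w, V = np.linalg.eigh(M)
        w = np.clip(w, eps, None)  # guard against zero eigenvalues
        return (V * np.log(w)) @ V.T
    return np.trace(A @ logm_sym(A) - A @ logm_sym(B) - A + B)
```

As a sanity check, the divergence is zero when the two metrics coincide and positive otherwise, e.g. `von_neumann_divergence(np.eye(2), np.eye(2))` is (numerically) 0, while `von_neumann_divergence(np.eye(2), 2 * np.eye(2))` is strictly positive.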