Geometry preserving multi-task metric learning
Machine Learning
Multi-task learning has been widely studied in machine learning because it can improve the performance of multiple related learning problems. However, few researchers have applied it to the important problem of metric learning. In this paper, we propose to couple multiple related metric learning tasks with the von Neumann divergence. On one hand, the novel regularized approach extends previous methods from vector regularization to a general matrix regularization framework; on the other hand, and more importantly, by exploiting the von Neumann divergence as the regularizer, the new multi-task metric learning method preserves the data geometry well. This leads to more appropriate propagation of side information among tasks and offers the potential for further performance gains. We introduce the concept of geometry preserving probability (PG) and show theoretically that our framework leads to a larger PG. In addition, our formulation proves to be jointly convex, so the globally optimal solution is guaranteed. A series of experiments across very different disciplines verifies that the proposed algorithm consistently outperforms current methods.
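To make the regularizer concrete, the von Neumann divergence between two symmetric positive-definite matrices X and Y is D(X, Y) = tr(X log X − X log Y − X + Y), where log denotes the matrix logarithm. A minimal sketch of computing it via eigendecomposition is shown below; the function name and the eigenvalue clipping threshold are illustrative choices, not part of the paper's formulation:

```python
import numpy as np

def von_neumann_divergence(X, Y, eps=1e-12):
    """Von Neumann divergence D(X, Y) = tr(X log X - X log Y - X + Y)
    between symmetric positive-definite matrices X and Y."""
    def logm_sym(A):
        # Matrix logarithm of a symmetric PD matrix via eigendecomposition.
        w, V = np.linalg.eigh(A)
        w = np.clip(w, eps, None)  # guard against tiny negative eigenvalues
        return V @ np.diag(np.log(w)) @ V.T
    return float(np.trace(X @ logm_sym(X) - X @ logm_sym(Y) - X + Y))

# The divergence vanishes when the two metrics coincide and is
# positive otherwise, which is what makes it usable as a coupling
# regularizer between per-task metric matrices.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.eye(2)
print(von_neumann_divergence(A, A))  # ≈ 0.0
print(von_neumann_divergence(A, B))  # > 0
```

Unlike a squared Frobenius penalty, this divergence is defined through the spectra of the metric matrices, which is the property the paper exploits to preserve data geometry when propagating side information across tasks.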