Supervised distance metric learning plays a substantial role in the success of statistical classification and information retrieval. Although many related algorithms have been proposed, how to incorporate both the geometric information (i.e., locality) and the label information (i.e., globality) in metric learning remains an open problem. In this paper, we propose a novel metric learning framework, called ''Dependence Maximization based Metric Learning'' (DMML), which efficiently integrates these two sources of information into a unified structure as instances of convex programming, without requiring balance weights. In DMML, the metric is trained by maximizing the dependence between data distributions in reproducing kernel Hilbert spaces (RKHSs). Unlike existing information-theoretic algorithms, however, DMML requires no estimation of, or assumptions about, the data distributions. Under this framework, we present two methods that employ different independence criteria, namely the Hilbert-Schmidt Independence Criterion and the generalized Distance Covariance. Comprehensive experimental results on classification, visualization, and image retrieval demonstrate that DMML favorably outperforms state-of-the-art metric learning algorithms, and illustrate the respective advantages of the two proposed methods in the related applications.
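To make the dependence-maximization idea concrete, the following is a minimal sketch of the biased empirical Hilbert-Schmidt Independence Criterion (HSIC) estimator that the abstract refers to, computed as trace(KHLH)/(n-1)^2 with a centering matrix H. The RBF kernel, bandwidth, and function names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    # Biased empirical HSIC: trace(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the kernel matrices.
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

A larger HSIC value indicates stronger statistical dependence between the two samples; in DMML the metric parameters would be optimized to maximize such a dependence measure between the transformed data and the labels.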