Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central operation in most of these areas is measuring the difference between two multivariate Gaussians. Unfortunately, traditional measures based on the Kullback-Leibler (KL) divergence and the Bhattacharyya distance do not satisfy all metric axioms required by many algorithms: the KL divergence is not symmetric, and neither measure obeys the triangle inequality. In this paper we propose modifications to the KL divergence and the Bhattacharyya distance, for multivariate Gaussian densities, that transform the two measures into distance metrics. Next, we show how these metric axioms impact the unfolding process of manifold learning algorithms. Finally, we illustrate the efficacy of the proposed metrics on two different manifold learning algorithms when used for motion clustering in video data. Our results show that, in this application, the proposed metrics yield significant performance gains (at least 7%) over other divergence measures.
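For concreteness, the sketch below computes the standard closed-form KL divergence and Bhattacharyya distance between two multivariate Gaussians, together with the Hellinger distance, one well-known metric derived from the Bhattacharyya coefficient. This is a minimal illustration under assumed function names; the Hellinger construction is shown only as a familiar example of repairing the metric axioms and is not the modification proposed in the paper.

import numpy as np

def kl_gaussian(mu_p, cov_p, mu_q, cov_q):
    # Closed-form KL(p || q) for p = N(mu_p, cov_p), q = N(mu_q, cov_q).
    # Note the asymmetry: KL(p || q) != KL(q || p) in general.
    d = mu_p.shape[0]
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    return 0.5 * (np.trace(cov_q_inv @ cov_p)
                  + diff @ cov_q_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p)))

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    # Closed-form Bhattacharyya distance between two Gaussians.
    # Symmetric, but it violates the triangle inequality.
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov)
                         / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

def hellinger_gaussian(mu1, cov1, mu2, cov2):
    # Hellinger distance sqrt(1 - BC), where BC = exp(-D_B) is the
    # Bhattacharyya coefficient. Unlike D_B itself, this satisfies all
    # metric axioms; shown as an illustrative metric, not the paper's.
    db = bhattacharyya_gaussian(mu1, cov1, mu2, cov2)
    return np.sqrt(1.0 - np.exp(-db))

# Usage: two 2-D Gaussians.
mu1, cov1 = np.zeros(2), np.eye(2)
mu2, cov2 = np.ones(2), 2.0 * np.eye(2)
print(kl_gaussian(mu1, cov1, mu2, cov2))         # asymmetric divergence
print(bhattacharyya_gaussian(mu1, cov1, mu2, cov2))
print(hellinger_gaussian(mu1, cov1, mu2, cov2))  # symmetric metric in [0, 1]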