This is a survey paper in which we explore the connection between graph representations and dissimilarity measures from an information-theoretic perspective. First, we pose graph comparison (or indexing) in terms of entropic manifold alignment. In this setting, graphs are encoded by multi-dimensional point clouds resulting from their embedding. Once these point clouds are aligned, we explore several dissimilarity measures: multi-dimensional statistical tests (such as the Henze-Penrose Divergence and the Total Variation k-dP Divergence), the Symmetrized Normalized Entropy Square variation (SNESV), and Mutual Information. Most of these divergences rely on multi-dimensional entropy estimators. Second, we address the representation of graphs in terms of populations of tensors obtained by characterizing topological multi-scale subgraphs through covariances of informative spectral features. These covariances are mapped to a proper tangent space and then treated as zero-mean Gaussian distributions. Each graph can therefore be encoded by a linear combination of Gaussians, where the coefficients of the combination rely on unbiased geodesics. Distributional graph representations allow us to exploit a large family of dissimilarities used in information theory. We focus on Bregman divergences (particularly Total Bregman Divergences) based on the Jensen-Shannon and Jensen-Rényi divergences. We refer to this latter approach as tensor-based distributional comparison, since distributions can also be estimated from embeddings through Gaussian mixtures.
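As a concrete illustration of the tensor-based pipeline sketched above, the following is a minimal Python sketch: it builds a covariance descriptor from node feature vectors, maps the SPD matrix to the tangent space at the identity via the matrix logarithm, and compares two descriptors viewed as zero-mean Gaussians. For the comparison it uses the symmetrized Kullback-Leibler divergence, which has a closed form for Gaussians — a stand-in for the Jensen-Shannon/Jensen-Rényi and Total Bregman divergences discussed in the paper, which require estimation. All function names and the toy data are illustrative, not the paper's actual implementation.

```python
import numpy as np

def covariance_descriptor(F, eps=1e-6):
    """Covariance of node feature vectors F (n_nodes x d), regularized to stay SPD."""
    return np.cov(F, rowvar=False) + eps * np.eye(F.shape[1])

def spd_log(S):
    """Matrix logarithm of an SPD matrix: its image in the tangent space at the identity."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def kl_zero_mean(S0, S1):
    """Closed-form KL divergence between zero-mean Gaussians N(0, S0) and N(0, S1)."""
    d = S0.shape[0]
    _, ld0 = np.linalg.slogdet(S0)
    _, ld1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(np.linalg.solve(S1, S0)) - d + ld1 - ld0)

def symmetrized_kl(S0, S1):
    """Symmetric dissimilarity between two Gaussian (covariance) descriptors."""
    return 0.5 * (kl_zero_mean(S0, S1) + kl_zero_mean(S1, S0))

# Toy usage: two "graphs" summarized by covariances of 3-dimensional spectral features.
rng = np.random.default_rng(0)
S_a = covariance_descriptor(rng.normal(size=(50, 3)))
S_b = covariance_descriptor(2.0 * rng.normal(size=(50, 3)))
T_a = spd_log(S_a)                 # tangent-space (log) image of the first descriptor
d_ab = symmetrized_kl(S_a, S_b)    # dissimilarity between the two graphs
```

The eigendecomposition route to the matrix logarithm avoids an extra SciPy dependency and is valid because covariance descriptors are symmetric positive definite after regularization.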