Efficient retrieval often requires an indexing structure on the database in question. We present an indexing scheme for cases when the dissimilarity measure is the Kullback-Leibler (KL) divergence. Devising such a scheme is difficult because the KL divergence is not a metric, failing to satisfy the triangle inequality or even finiteness in general. We define an optimal representative of a set of distributions to serve as the basis of such an indexing structure. This representative, dubbed the exponential information theoretic center, minimizes the worst-case KL divergence from it to the elements of its set. This, along with a lower bound on the KL divergence from the query to the elements of a set, allows us to prune the search, increasing efficiency while guaranteeing that we never discard the nearest neighbors. We present results of querying the Princeton Shape Database which show significant speed-ups over an exhaustive search and over an analogous approach using a more mundane representative.
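The scheme described above can be illustrated with a minimal sketch. The KL divergence below shows the two failures the abstract names (asymmetry, hence no triangle inequality in the metric sense, and possible infiniteness), and a cluster-pruned nearest-neighbor search shows how any valid lower bound lets whole sets be skipped without ever discarding the true nearest neighbor. The `trivial_bound` placeholder is hypothetical; the paper's actual bound, derived from the exponential information theoretic center, is not reproduced here.

```python
import math

def kl(p, q):
    """KL divergence D(p || q) between discrete distributions (lists summing to 1)."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0.0:
            continue           # 0 * log(0/q) = 0 by convention
        if qi == 0.0:
            return math.inf    # infinite when q assigns 0 where p does not
        total += pi * math.log(pi / qi)
    return total

# Not a metric: D(p||q) is finite here while D(q||p) is infinite.
p = [0.5, 0.5, 0.0]
q = [0.25, 0.25, 0.5]
assert kl(p, q) < math.inf and kl(q, p) == math.inf

def nn_search(query, clusters, lower_bound):
    """Exact nearest neighbor under D(query || x), pruning whole clusters.

    clusters: list of lists of distributions. lower_bound(query, cluster)
    must never exceed the true minimum divergence into the cluster, so a
    pruned cluster provably cannot contain the nearest neighbor.
    """
    best_d, best_x = math.inf, None
    for cluster in clusters:
        if lower_bound(query, cluster) >= best_d:
            continue  # no element of this cluster can beat the current best
        for x in cluster:
            d = kl(query, x)
            if d < best_d:
                best_d, best_x = d, x
    return best_x, best_d

# Placeholder bound: always valid but never prunes (hypothetical stand-in
# for the paper's representative-based bound).
def trivial_bound(query, cluster):
    return 0.0
```

The tighter the lower bound, the more clusters are pruned; the abstract's point is that the exponential information theoretic center yields a better bound, and hence more pruning, than a "more mundane" representative would.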