Organizing objects into groups based on their co-occurrence with a second, relevance variable has been widely studied, with the Information Bottleneck (IB) as one of its most prominent representatives. We present a kernel-based approach to pairwise clustering of discrete histograms using the Jensen-Shannon (JS) divergence, which can be interpreted as a two-sample test. This yields a cost criterion with a solid information-theoretic justification that can be approximated in polynomial time to arbitrary precision. Moreover, a relation to optimal hard-clustering IB solutions can be established. To our knowledge, we are the first to devise algorithms for the IB with provable approximation guarantees. In practice, fast optimization heuristics yield convincing results in the context of image segmentation.
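The pairwise dissimilarity underlying this approach, the Jensen-Shannon divergence between two discrete histograms, can be sketched as below. This is an illustrative implementation only (the function names and the base-2 normalization are our own, not taken from the paper); it uses the standard identity JS(P, Q) = ½ KL(P ‖ M) + ½ KL(Q ‖ M) with M the midpoint distribution.

```python
import math

def js_divergence(p, q):
    """Jensen-Shannon divergence (base-2 logs) between two discrete
    probability distributions given as equal-length sequences."""
    # Midpoint distribution M = (P + Q) / 2.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]

    def kl(a, b):
        # Kullback-Leibler divergence D(a || b); terms with a_i = 0 contribute 0.
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Identical histograms have zero divergence; disjoint supports give 1 bit,
# the maximum for base-2 logarithms.
print(js_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

With base-2 logarithms the JS divergence is bounded in [0, 1] and is symmetric in its arguments, which is what makes it usable as a pairwise clustering cost between histograms.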