This paper presents a new information-theoretic clustering method that exploits the minimum entropy principle and the quadratic distance measure between probability densities. We derive a new minimum entropy objective function that leads to the maximization of within-cluster association, and give a simple implementation using gradient ascent. In addition, we show that the minimum entropy principle yields the objective function of k-means clustering, and that maximum within-cluster association is closely related to spectral clustering, an eigendecomposition-based method. This information-theoretic view of spectral clustering motivates the use of kernel density estimation in constructing an affinity matrix.
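The pipeline sketched in the abstract — build an affinity matrix with a Gaussian (Parzen) kernel, as in kernel density estimation, then cluster via the top eigenvectors of the normalized affinity matrix — can be illustrated as follows. This is a minimal, generic spectral-clustering sketch, not the authors' exact algorithm; the bandwidth `sigma`, the farthest-point initialization, and the small Lloyd loop are illustrative choices.

```python
import numpy as np

def gaussian_affinity(X, sigma=1.0):
    """Affinity matrix A_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)),
    i.e. a Gaussian kernel of the kind used in kernel density estimation."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def spectral_cluster(X, k, sigma=1.0, iters=20):
    """Generic normalized spectral clustering: embed points with the top-k
    eigenvectors of D^{-1/2} A D^{-1/2}, then run a few k-means steps."""
    A = gaussian_affinity(X, sigma)
    d = A.sum(axis=1)
    L = A / np.sqrt(np.outer(d, d))          # normalized affinity
    _, vecs = np.linalg.eigh(L)              # eigenvalues ascending
    V = vecs[:, -k:]                         # top-k eigenvectors
    V = V / np.linalg.norm(V, axis=1, keepdims=True)  # row-normalize

    # Farthest-point initialization, then Lloyd iterations.
    centers = [V[0]]
    for _ in range(k - 1):
        dist = np.min([np.linalg.norm(V - c, axis=1) for c in centers], axis=0)
        centers.append(V[np.argmax(dist)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((V[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([V[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels
```

On two well-separated point clouds, the kernel affinity is nearly block-diagonal and the top eigenvectors separate the clusters cleanly, which is the connection between within-cluster association and the eigendecomposition view that the paper develops.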