We address unsupervised learning subject to structural constraints, with particular emphasis on clustering under an imposed decision-tree structure. Most known methods are greedy, optimizing one node of the tree at a time to minimize a local cost. By contrast, we develop a joint optimization method, derived from information-theoretic principles and closely related to known methods in statistical physics. The approach is inspired by the deterministic annealing algorithm for unstructured data clustering, which was based on maximum-entropy inference. The new approach is founded on the principle of minimum cross-entropy, using informative priors to approximate the unstructured clustering solution while imposing the structural constraint. The resulting method incorporates supervised learning principles applied in an unsupervised problem setting. In our approach, the tree "grows" by a sequence of bifurcations that occur while optimizing an effective free-energy cost at decreasing temperature scales. Thus, estimates of the tree size and structure are naturally obtained at each temperature in the process. Examples demonstrate considerable improvement over known methods.
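To make the unstructured starting point concrete, the following is a minimal sketch of deterministic annealing clustering in the maximum-entropy style the abstract refers to: points receive soft (Gibbs) assignments to centroids at a temperature T, centroids are re-estimated as weighted means, and T is lowered geometrically so that centroids bifurcate as the effective free energy changes. All parameter values and function names here are illustrative assumptions, not the authors' implementation, and the tree-structural constraint described in the abstract is not included.

```python
import numpy as np

def deterministic_annealing(X, k, T0=1.0, T_min=1e-3, cool=0.9, iters=50):
    """Illustrative deterministic annealing clustering (no tree constraint).

    At temperature T, each point x gets Gibbs weights
    p(j|x) ∝ exp(-||x - y_j||^2 / T); centroids y_j are then re-estimated
    as weighted means, and T is lowered geometrically.
    """
    n, d = X.shape
    rng = np.random.default_rng(0)
    # Start all centroids near the data mean; they split ("bifurcate")
    # as the temperature drops below critical values.
    Y = X.mean(axis=0) + 1e-3 * rng.standard_normal((k, d))
    T = T0
    while T > T_min:
        for _ in range(iters):
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # (n, k)
            d2 -= d2.min(axis=1, keepdims=True)   # stabilize the exponent
            P = np.exp(-d2 / T)
            P /= P.sum(axis=1, keepdims=True)     # soft assignments p(j|x)
            Y = (P.T @ X) / P.sum(axis=0)[:, None]  # weighted centroid update
        T *= cool
    return Y, P
```

At high T the assignments are nearly uniform and all centroids coincide; as T decreases the free-energy minimum splits, which is the mechanism the abstract's tree-growing procedure exploits by tracking bifurcations across temperature scales.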