Several manifold learning techniques have been developed to learn, given a data set, a single lower-dimensional manifold providing a compact representation of the original data. However, for complex data sets containing multiple manifolds, possibly of different dimensionalities, it is unlikely that existing manifold learning approaches can discover all the interesting lower-dimensional structures. We therefore introduce a hierarchical manifold learning framework to discover a variety of the underlying low-dimensional structures. The framework is based on a hierarchical mixture of latent variable models, in which each submodel is a latent variable model capturing a single manifold. We propose a novel multiple-manifold approximation strategy used for the initialization of our hierarchical model. The technique is first verified on artificial data with mixed 1-, 2- and 3-dimensional structures. It is then used to automatically detect lower-dimensional structures in disrupted satellite galaxies.
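The hierarchical framework itself is not reproduced here, but the core initialization idea the abstract describes — partition the data into candidate manifolds and estimate each part's intrinsic dimensionality with a local PCA-like latent variable model — can be sketched as follows. This is a minimal illustration under assumed data (a noisy 1-D line and a 2-D plane embedded in R^3, echoing the paper's mixed-dimensionality test setting); the partition rule, variance threshold, and function names are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a 1-D line and a 2-D plane embedded in R^3,
# mimicking a mixed-dimensionality setting (assumed example data).
t = rng.uniform(0, 3, 300)
line = t[:, None] * (np.ones(3) / np.sqrt(3)) + 0.01 * rng.standard_normal((300, 3))
uv = rng.uniform(-1, 1, (300, 2))
plane = np.c_[uv[:, 0], uv[:, 1], np.full(300, 5.0)] + 0.01 * rng.standard_normal((300, 3))
X = np.vstack([line, plane])

def local_pca_dim(points, var_threshold=0.95):
    """Estimate intrinsic dimensionality from the PCA spectrum of a point set:
    the number of leading eigenvalues needed to explain var_threshold of the variance."""
    Xc = points - points.mean(axis=0)
    # Eigenvalues of the sample covariance matrix, largest first.
    evals = np.linalg.eigvalsh(Xc.T @ Xc / len(Xc))[::-1]
    ratios = np.cumsum(evals) / evals.sum()
    return int(np.searchsorted(ratios, var_threshold) + 1)

# Crude two-way partition (the two structures are well separated in z here;
# a real initialization would use a principled multiple-manifold approximation).
labels = (X[:, 2] > 3.0).astype(int)
dims = [local_pca_dim(X[labels == k]) for k in (0, 1)]
print(dims)  # → [1, 2]: the line is 1-dimensional, the plane 2-dimensional
```

Each recovered group, with its estimated dimensionality, would then seed one latent variable submodel in the hierarchical mixture.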