Invariant feature set generation with the linear manifold self-organizing map
ACCV'10 Proceedings of the 10th Asian conference on Computer vision - Volume Part IV
This paper presents a neural model that learns low-dimensional nonlinear manifolds embedded in a higher-dimensional data space as mixtures of local linear manifolds under a self-organizing framework. Compared with similar networks, the local linear manifolds learned by our network represent local data distributions in a more localized way, thanks to a new distortion measure that removes the confusion between sub-models found in many comparable mixture models. Each neuron in the network asymptotically learns a mean vector and a principal subspace of the data in its local region, and it is proved that each sub-model has no local extremum. Experiments show that the new mixture model adapts to nonlinear manifolds of various data distributions better than similar models. Its online-learning property is desirable when the data set is very large, when computational efficiency is paramount, or when data arrive sequentially. We further show an application of this model to the recognition of handwritten digit images based on mixtures of local linear manifolds.
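The abstract's core ingredients can be illustrated with a minimal sketch: a set of units, each holding a mean vector and an orthonormal basis spanning a local principal subspace, trained online by winner-take-all on a reconstruction-distortion measure. This is a hypothetical simplification, not the paper's exact algorithm (the class name, learning rate, and the Hebbian-plus-QR subspace update are illustrative assumptions):

```python
import numpy as np


def manifold_distortion(x, mean, basis):
    # Squared distance from x to the local linear manifold {mean + basis @ a}:
    # residual energy after projecting the centered sample onto the subspace.
    d = x - mean
    y = basis.T @ d  # coordinates of the projection in the subspace
    return float(d @ d - y @ y)


class LocalLinearManifoldSOM:
    """Illustrative mixture of local linear manifolds (not the paper's exact
    update rules). k units, each with a mean in R^dim and an m-dimensional
    orthonormal basis, adapted online from one sample at a time."""

    def __init__(self, k, dim, m, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.means = rng.normal(size=(k, dim))
        # Batched QR gives each unit a random orthonormal dim x m basis.
        self.bases = np.linalg.qr(rng.normal(size=(k, dim, m)))[0]
        self.lr = lr

    def fit_step(self, x):
        # Winner-take-all: the unit whose local manifold reconstructs x best.
        dists = [manifold_distortion(x, mu, B)
                 for mu, B in zip(self.means, self.bases)]
        w = int(np.argmin(dists))
        # Move the winner's mean toward the sample (online mean estimate).
        self.means[w] += self.lr * (x - self.means[w])
        # Hebbian-style subspace step, then re-orthonormalize via QR.
        d = x - self.means[w]
        y = self.bases[w].T @ d
        self.bases[w] += self.lr * np.outer(d, y)
        self.bases[w] = np.linalg.qr(self.bases[w])[0]
        return w


# Usage: stream noisy samples lying near the x-axis (a 1-D manifold in R^3).
som = LocalLinearManifoldSOM(k=2, dim=3, m=1)
rng = np.random.default_rng(1)
for _ in range(200):
    x = np.array([rng.normal(), 0.0, 0.0]) + 0.01 * rng.normal(size=3)
    som.fit_step(x)
```

Because each `fit_step` consumes a single sample, the sketch mirrors the online-learning property highlighted in the abstract: no batch covariance or eigendecomposition over the full data set is required.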