We argue that a large class of manifold learning algorithms, namely those that are essentially local and can be framed as kernel learning algorithms, suffers from the curse of dimensionality at the dimension of the true underlying manifold. This observation motivates exploring nonlocal manifold learning algorithms that attempt to discover shared structure among the tangent planes at different positions. We propose a training criterion for such an algorithm and present experiments estimating a tangent-plane prediction function, showing its advantages over local manifold learning algorithms: it generalizes very far from the training data when learning handwritten character image rotations, a setting where local nonparametric methods fail.
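To make the contrast concrete, the following is a minimal sketch (illustrative only, not the paper's implementation) of the *local* approach the abstract criticizes: estimating the tangent plane of a manifold at a query point by running PCA on its k nearest training neighbors. The data (points on a unit circle, a 1-D manifold in 2-D) and all parameter choices here are assumptions made for the example.

```python
import numpy as np

# Sample a 1-D manifold embedded in 2-D: the unit circle.
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # shape (200, 2)

# Query point on the manifold; its true tangent direction is (0, +/-1).
x = np.array([1.0, 0.0])

# Local method: take the k nearest neighbors and fit a tangent line by PCA.
k = 10
dists = np.linalg.norm(X - x, axis=1)
neighbors = X[np.argsort(dists)[:k]]
centered = neighbors - neighbors.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
tangent = Vt[0]  # leading principal direction ~ local tangent at x

print(tangent)  # close to (0, +/-1)
```

This purely local estimate needs training points near every query: far from the data (e.g., an unseen rotation of a character image), there are no neighbors to average, which is the failure mode the abstract points to. The nonlocal alternative it proposes instead trains a single parametric function mapping any input point to a predicted tangent basis, so structure shared across the manifold can be exploited away from the training set.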