Maximum Variance Unfolding is one of the main methods for (nonlinear) dimensionality reduction. We study its large-sample limit, providing specific rates of convergence under standard assumptions. We find that it is consistent when the underlying submanifold is isometric to a convex subset, and we provide some simple examples where it fails to be consistent.
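
As a brief reminder of the method under study (a standard formulation, not part of the original abstract), Maximum Variance Unfolding is usually posed as a semidefinite program over the Gram matrix $K$ of the embedded points: local distances to the $k$ nearest neighbors are preserved exactly while the total variance is maximized,

\[
\begin{aligned}
\max_{K \succeq 0} \quad & \operatorname{tr}(K) \\
\text{s.t.} \quad & \textstyle\sum_{i,j} K_{ij} = 0, \\
& K_{ii} - 2K_{ij} + K_{jj} = \lVert x_i - x_j \rVert^2 \quad \text{for all neighbor pairs } (i,j).
\end{aligned}
\]

The low-dimensional embedding is then read off from the top eigenvectors of the optimal $K$, scaled by the square roots of the corresponding eigenvalues.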