The locally linear embedding (LLE) algorithm is a non-linear dimensionality-reduction technique widely used for its computational simplicity and intuitive approach. LLE first linearly reconstructs each input point from its nearest neighbors and then preserves these neighborhood relations in a low-dimensional embedding. We show that the reconstruction weights computed by LLE capture the high-dimensional structure of the neighborhoods, not the low-dimensional manifold structure. Consequently, the weight vectors are highly sensitive to noise. Moreover, this causes LLE to converge to a linear projection of the input, contrary to its goal of a non-linear embedding. To resolve both problems, we propose computing the weight vectors from a low-dimensional representation of each neighborhood. We call this technique LDR-LLE. We present numerical examples of the perturbation and linear projection problems, and of the improved outputs obtained with the low-dimensional neighborhood representation.
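The two-step procedure described above can be sketched in a few lines of numpy. This is a minimal illustration of standard LLE, not the LDR-LLE variant proposed in the paper; the function and parameter names are our own, and the trace-based regularization of the local Gram matrix is one common choice, shown here for illustration.

```python
import numpy as np

def lle(X, n_neighbors=8, n_components=2, reg=1e-3):
    """Minimal sketch of locally linear embedding (LLE).

    Step 1: reconstruct each point from its nearest neighbors with
    weights constrained to sum to one. Step 2: find the low-dimensional
    coordinates that best preserve those reconstruction weights.
    """
    n = X.shape[0]
    # Pairwise squared distances; exclude each point from its own neighbor set.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nbrs = np.argsort(d2, axis=1)[:, :n_neighbors]

    # Step 1: per-point constrained least squares via the local Gram matrix.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                 # neighbors centered at x_i
        G = Z @ Z.T                           # local Gram matrix
        G += reg * np.trace(G) * np.eye(n_neighbors)  # regularize (rank/noise)
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()           # enforce sum-to-one constraint

    # Step 2: bottom eigenvectors of M = (I - W)^T (I - W),
    # discarding the constant eigenvector with eigenvalue ~0.
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:n_components + 1]
```

Because the weights in step 1 depend only on the local Gram matrix `Z @ Z.T`, one can see directly how they are determined by the full high-dimensional geometry of each neighborhood, which is the sensitivity the abstract analyzes.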