Nonlinear dimensionality reduction is an important problem in many machine learning areas where essentially low-dimensional data are nonlinearly embedded in a high-dimensional space. In this paper, we show that the existing Laplacian Eigenmaps method suffers from a distortion problem, and we propose a new distortion-free dimensionality reduction method that adopts a local linear model to encode local information. We introduce a new loss function, which can be seen as a different way to construct the Laplacian matrix, and a new way to impose scaling constraints under the local linear model. Better low-dimensional embeddings are then obtained via the constrained concave-convex procedure (CCCP). Empirical studies and real-world applications demonstrate the effectiveness of our method.
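For context, the baseline the abstract refers to can be sketched as follows. This is a minimal NumPy/SciPy implementation of the standard Laplacian Eigenmaps method (the approach the paper identifies as distortion-prone), not the paper's proposed distortion-free method; the function name, the heat-kernel bandwidth `t`, and the symmetrized k-nearest-neighbor graph construction are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, n_neighbors=5, t=1.0):
    """Standard Laplacian Eigenmaps embedding (illustrative sketch).

    X: (n_samples, n_features) data matrix.
    Returns an (n_samples, n_components) low-dimensional embedding.
    """
    n = X.shape[0]
    D2 = cdist(X, X, "sqeuclidean")  # pairwise squared distances

    # Build a symmetric k-nearest-neighbor graph with heat-kernel weights.
    idx = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]  # skip self at column 0
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    W[rows, idx.ravel()] = np.exp(-D2[rows, idx.ravel()] / t)
    W = np.maximum(W, W.T)  # symmetrize the adjacency

    # Graph Laplacian L = D - W, with D the diagonal degree matrix.
    D = np.diag(W.sum(axis=1))
    L = D - W

    # Solve the generalized eigenproblem L y = lambda D y and keep the
    # eigenvectors for the smallest nonzero eigenvalues (skip the trivial
    # constant eigenvector at index 0).
    vals, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]
```

The D-weighted eigenproblem is what imposes the scaling constraint yᵀDy = 1 in the classical formulation; the paper's contribution is, in part, a different loss and a different way of imposing such scaling constraints under a local linear model.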