The computational complexity of differential and integral equations: an information-based approach
Numerical recipes in C (2nd ed.): the art of scientific computing
Topology representing networks. Neural Networks
Self-organizing maps
Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation
Normalized Cuts and Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence
Neural Networks for Pattern Recognition
Nonlinear Projection with the Isotop Method. ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
Non-linear dimensionality reduction techniques for classification and visualization. Proceedings of the eighth ACM SIGKDD international conference on Knowledge discovery and data mining
Laplacian Eigenmaps for dimensionality reduction and data representation. Neural Computation
GPCA: an efficient dimension reduction scheme for image compression and retrieval. Proceedings of the tenth ACM SIGKDD international conference on Knowledge discovery and data mining
Learning a kernel matrix for nonlinear dimensionality reduction. ICML '04 Proceedings of the twenty-first international conference on Machine learning
Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment. SIAM Journal on Scientific Computing
Unsupervised, Information-Theoretic, Adaptive Image Filtering for Image Restoration. IEEE Transactions on Pattern Analysis and Machine Intelligence
Learning Eigenfunctions Links Spectral Embedding and Kernel PCA. Neural Computation
Riemannian manifold learning for nonlinear dimensionality reduction. ECCV'06 Proceedings of the 9th European conference on Computer Vision - Volume Part I
Manifold Learning: The Price of Normalization. The Journal of Machine Learning Research
Nonlinear dimensionality reduction by locally linear inlaying. IEEE Transactions on Neural Networks
Non-linear dimensionality reduction of noisy data is a challenging problem encountered in a variety of data analysis applications. Recent results in the literature show that spectral decomposition, as used for example by the Laplacian Eigenmaps algorithm, provides a powerful tool for non-linear dimensionality reduction and manifold learning. In this paper, we discuss a significant shortcoming of these approaches, which we refer to as the repeated eigendirections problem. We propose a novel approach that combines successive 1-dimensional spectral embeddings with a data advection scheme that allows us to address this problem. The proposed method does not depend on a non-linear optimization scheme; hence, it is not prone to local minima. Experiments with artificial and real data illustrate the advantages of the proposed method over existing approaches. We also demonstrate that the approach is capable of correctly learning manifolds corrupted by significant amounts of noise.
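To make the spectral-embedding step referred to above concrete, the following is a minimal, illustrative sketch of a one-dimensional Laplacian Eigenmaps embedding using only NumPy. It is not the paper's proposed method (which adds successive embeddings and a data advection scheme); it shows only the standard building block: a symmetrized k-nearest-neighbor graph with Gaussian edge weights, the normalized graph Laplacian, and the eigenvector of the smallest non-trivial eigenvalue. The function name, `n_neighbors`, and `sigma` are illustrative choices, not quantities from the paper.

```python
import numpy as np

def laplacian_eigenmap_1d(X, n_neighbors=10, sigma=1.0):
    """Illustrative 1-D Laplacian Eigenmaps embedding of rows of X."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbor edges (excluding self), with Gaussian weights,
    # symmetrized so the graph is undirected.
    idx = np.argsort(sq, axis=1)[:, 1:n_neighbors + 1]
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    W[rows, idx.ravel()] = np.exp(-sq[rows, idx.ravel()] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)
    # Normalized Laplacian: I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    L = np.eye(n) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L)
    # Skip the trivial constant eigenvector; undo the normalization.
    return d_inv_sqrt * vecs[:, 1]

# Noisy samples along a circular arc: the 1-D embedding should recover
# their ordering along the curve.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 3, 200))
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(200, 2))
y = laplacian_eigenmap_1d(X)
```

For a simple open curve like this, the recovered coordinate `y` varies monotonically (up to sign) with the arc-length parameter `t`. The repeated-eigendirections problem discussed in the abstract arises when one instead keeps several eigenvectors and two or more of them parameterize the same intrinsic direction of the manifold.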