Small worlds: the dynamics of networks between order and randomness
Laplacian Eigenmaps for dimensionality reduction and data representation. Neural Computation.
A kernel view of the dimensionality reduction of manifolds. ICML '04 Proceedings of the twenty-first international conference on Machine learning.
Principal Manifolds and Nonlinear Dimensionality Reduction via Tangent Space Alignment. SIAM Journal on Scientific Computing.
ISICA '08 Proceedings of the 3rd International Symposium on Advances in Computation and Intelligence
Supervised subspace projections for constructing ensembles of classifiers. Information Sciences: an International Journal.
An extended ISOMAP by enhancing similarity for clustering. IEA/AIE'12 Proceedings of the 25th international conference on Industrial Engineering and Other Applications of Applied Intelligent Systems: advanced research in applied artificial intelligence.
Manifold learning performs nonlinear dimensionality reduction on data lying in a high-dimensional space; ISOMAP, LLE, Laplacian Eigenmaps, LTSA, and multilayer autoencoders are representative algorithms. Most of them are defined only on the training set and run in batch mode: they provide no model or explicit mapping that projects new data into the low-dimensional space. In this paper, we propose an incremental manifold learning algorithm based on the small-world model, which generalizes ISOMAP to new samples. First, the k nearest neighbors and some faraway points are selected from the training set for each new sample. Then the low-dimensional embedding of the new sample is obtained by preserving the geodesic distances between it and those points. Experiments demonstrate that the proposed method projects new samples into the low-dimensional space effectively and has lower computational complexity.
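The out-of-sample step described above — choosing k nearest neighbors plus a few faraway anchor points, then placing the new sample so that its distances to those anchors in the low-dimensional space match the estimated geodesic distances — can be sketched as a linear least-squares (multilateration) problem. The function names, the anchor-sampling rule, and the least-squares formulation below are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np


def select_anchors(X_train, x_new, k=5, n_far=3, rng=None):
    """Pick the k nearest training points to x_new plus a few faraway
    points, echoing the paper's small-world-inspired anchor selection.
    (The uniform sampling of faraway points is an assumption.)"""
    rng = np.random.default_rng(rng)
    d = np.linalg.norm(X_train - x_new, axis=1)
    order = np.argsort(d)
    near = order[:k]                                   # k nearest neighbors
    far = rng.choice(order[k:], size=n_far, replace=False)  # faraway points
    return np.concatenate([near, far])


def embed_new_sample(anchors_low, dists):
    """Place a new point y in the low-dimensional space so that
    ||y - y_i|| approximates the given (geodesic) distances d_i to the
    anchor embeddings y_i.  Subtracting the first squared-distance
    equation from the others linearizes the system:
        2 (y_i - y_0) . y = (d_0^2 - d_i^2) + (||y_i||^2 - ||y_0||^2)
    which is then solved by least squares."""
    anchors_low = np.asarray(anchors_low, dtype=float)  # (m, p) anchor coords
    dists = np.asarray(dists, dtype=float)              # (m,) target distances
    y0, d0 = anchors_low[0], dists[0]
    A = 2.0 * (anchors_low[1:] - y0)
    b = (d0**2 - dists[1:]**2) + (
        np.sum(anchors_low[1:] ** 2, axis=1) - np.dot(y0, y0)
    )
    y, *_ = np.linalg.lstsq(A, b, rcond=None)
    return y
```

With exact distances and enough anchors (more than the embedding dimension), the system is consistent and recovers the point exactly; with noisy geodesic estimates, the least-squares solution gives the distance-preserving compromise the abstract describes.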