Locally linear embedding: a survey
Artificial Intelligence Review
Locally linear embedding (LLE) is a popular manifold learning algorithm for nonlinear dimensionality reduction. However, its success depends heavily on an input parameter, the neighborhood size, and finding the optimal value for it remains an open problem. This paper focuses on that parameter and proposes that it should be self-tuned according to local density rather than set to a uniform value for all data points, as LLE does. We present a new variant of LLE that effectively prunes "short circuit" edges by performing spatial searches on an R*-tree built over the dataset. This pruning turns the original fixed neighborhood size into a self-tuning value, making our algorithm more topologically stable than LLE. Experiments confirm that our idea and method are effective.
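The core idea of a self-tuning neighborhood size can be sketched with a simple density-based pruning rule. This is a minimal NumPy illustration only, not the paper's R*-tree method: the `k_min`/`k_max` bounds, the density proxy (distance to the k_min-th nearest neighbor), and the pruning radius factor are all assumptions made for the example.

```python
import numpy as np

def adaptive_neighborhoods(X, k_min=4, k_max=12):
    """Pick a per-point neighborhood size from local density.

    Illustrative sketch: each point keeps up to k_max nearest
    neighbors, but any candidate farther than twice its local
    density radius is pruned as a potential "short circuit" edge.
    The constants here are assumptions, not the paper's values.
    """
    n = len(X)
    # pairwise Euclidean distances, with self-distance masked out
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    order = np.argsort(d, axis=1)  # neighbor indices sorted by distance
    # local density proxy: distance to the k_min-th nearest neighbor
    r_local = d[np.arange(n), order[:, k_min - 1]]
    neighborhoods = []
    for i in range(n):
        # keep at most k_max neighbors, pruning those beyond 2 * r_local[i]
        kept = [j for j in order[i, :k_max] if d[i, j] <= 2.0 * r_local[i]]
        neighborhoods.append(kept)
    return neighborhoods

# toy example: a dense cluster of 20 points plus one distant outlier
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), [[5.0, 5.0]]])
nbrs = adaptive_neighborhoods(X)
```

In this toy setup, points in the dense cluster have a small local radius, so the far-away outlier is pruned from their neighbor lists even though it would fit within a fixed top-k cutoff on a sparser dataset; each point's effective neighborhood size thus adapts to its local density.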