Neighborhood selection and eigenvalues for embedding data complex in low dimension
ACIIDS'12 Proceedings of the 4th Asian conference on Intelligent Information and Database Systems - Volume Part I
Advances in nonlinear dimensionality reduction provide a way to understand and visualize the underlying structure of complex data sets, and performance at large scale is of key importance in data mining, machine learning, and data analysis. In this paper, we concentrate on improving the performance of nonlinear dimensionality reduction for large-scale data sets on the GPU. In particular, we address the two dominant costs, k-nearest-neighbor (KNN) search and sparse spectral decomposition, and propose an efficient framework for Locally Linear Embedding (LLE). We implement a k-d tree based KNN algorithm and a Krylov subspace eigensolver on the GPU to accelerate nonlinear dimensionality reduction for large-scale data. Our GPU-based k-d tree LLE runs up to about 30–60× faster than the brute-force KNN [10] LLE model on the CPU. Overall, our methods save O(n² − 6n − 2k − 3) memory space.
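To make the pipeline concrete, here is a minimal CPU sketch of the three LLE stages the abstract describes: k-d tree KNN search, local reconstruction-weight solves, and a Krylov-subspace (Lanczos) sparse eigendecomposition. It uses SciPy stand-ins (`cKDTree`, `eigsh`) for illustration only; it is not the authors' GPU implementation, and the function name `lle` and all parameter defaults are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import lil_matrix, identity
from scipy.sparse.linalg import eigsh

def lle(X, k=10, d=2, reg=1e-3):
    """Sketch of LLE: hypothetical helper, not the paper's GPU code."""
    n = X.shape[0]

    # Stage 1: k nearest neighbors via a k-d tree
    # (the paper accelerates this stage on the GPU).
    tree = cKDTree(X)
    _, idx = tree.query(X, k=k + 1)
    idx = idx[:, 1:]                        # drop each point's self-match

    # Stage 2: reconstruction weights of each point from its neighbors.
    W = lil_matrix((n, n))
    for i in range(n):
        Z = X[idx[i]] - X[i]                # neighbors centered on x_i
        C = Z @ Z.T                         # local covariance (k x k)
        C += reg * np.trace(C) * np.eye(k)  # regularize for stability
        w = np.linalg.solve(C, np.ones(k))
        W[i, idx[i]] = w / w.sum()          # weights sum to 1

    # Stage 3: bottom eigenvectors of M = (I - W)^T (I - W) via a
    # Krylov-subspace solver (Lanczos, here ARPACK in shift-invert mode);
    # the smallest eigenvector is the trivial constant one and is skipped.
    I_minus_W = identity(n) - W.tocsr()
    M = I_minus_W.T @ I_minus_W
    _, vecs = eigsh(M, k=d + 1, sigma=0.0)
    return vecs[:, 1:d + 1]

# Usage: embed a noisy 3-D helix into 2-D.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 400)
X = np.c_[np.cos(t), np.sin(t), t / 4] + 0.01 * rng.standard_normal((400, 3))
Y = lle(X, k=12, d=2)
print(Y.shape)  # (400, 2)
```

The KNN stage and the per-point weight solves are embarrassingly parallel, which is why both map well to the GPU; the sparse eigensolve is the harder stage and is where the Krylov subspace method matters.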