In this paper, we propose Kernel Laplacian Eigenmaps for nonlinear dimensionality reduction. The method extends to any structured input beyond the usual vectorial data, enabling the visualization of a wider range of data in low dimensions once suitable kernels are defined. Comparisons with related methods on the MNIST handwritten digits data set support our approach. Beyond nonlinear dimensionality reduction, this approach makes visualization and related applications possible for non-vectorial data.
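The core idea of combining a kernel with Laplacian Eigenmaps can be sketched as follows. This is not the paper's exact algorithm, only a minimal illustration assuming the kernel values are used directly as graph edge weights and that an RBF kernel with a fixed bandwidth stands in for a structured-data kernel; the function name `kernel_laplacian_eigenmaps` and all parameters are hypothetical.

```python
import numpy as np
from scipy.linalg import eigh

def kernel_laplacian_eigenmaps(K, n_components=2):
    """Embed points given only a kernel (affinity) matrix K.

    Treats K as graph edge weights W, forms the graph Laplacian
    L = D - W, and solves the generalized eigenproblem
    L v = lambda * D v, keeping the eigenvectors with the smallest
    nonzero eigenvalues (the first eigenvector is the trivial
    constant one and is discarded).
    """
    W = np.asarray(K, dtype=float)
    D = np.diag(W.sum(axis=1))
    L = D - W
    # eigh solves the symmetric generalized eigenproblem L v = lam D v,
    # returning eigenvalues in ascending order
    vals, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]

# Toy usage: an RBF kernel over random points (assumed bandwidth 1.0).
# For structured data, K would instead come from a structure-aware kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / 2.0)
Y = kernel_laplacian_eigenmaps(K, n_components=2)
print(Y.shape)  # (30, 2)
```

Because only the kernel matrix enters the computation, the same routine applies unchanged to any input type for which a positive kernel is available, which is what enables the non-vectorial visualization discussed above.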