In this paper, we propose a new nonlinear dimensionality reduction algorithm, regularized Kernel Locally Linear Embedding (rKLLE), for highly structured data. It builds on the original LLE by introducing a kernel-alignment-type constraint that effectively reduces the solution space and yields embeddings that reflect the prior knowledge. To make the algorithm applicable to non-vectorial data, a kernelized LLE is used to compute the reconstruction weights. Our experiments on typical non-vectorial data show that rKLLE substantially improves on the results of KLLE.
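As background, the kernelized reconstruction-weight step mentioned above can be sketched as follows. This is not the paper's implementation, only a minimal illustration of the standard idea: replace Euclidean inner products in LLE's local Gram matrices with kernel evaluations, so that only a kernel matrix over the (possibly non-vectorial) data is needed. The function name `klle_weights` and the regularization parameter `reg` are illustrative choices, not from the paper.

```python
import numpy as np

def klle_weights(K, n_neighbors=5, reg=1e-3):
    """Kernel-LLE reconstruction weights from a kernel (Gram) matrix K.

    Neighbors are found via feature-space distances
    d^2(i, j) = K[i, i] + K[j, j] - 2 K[i, j]; the weights minimize the
    feature-space reconstruction error, exactly as in standard LLE but
    with all inner products supplied by the kernel.
    """
    n = K.shape[0]
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2 * K  # squared feature-space distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])
        idx = idx[idx != i][:n_neighbors]  # nearest neighbors, excluding i itself
        # Local Gram matrix C[j, l] = <phi(x_i) - phi(x_j), phi(x_i) - phi(x_l)>
        C = (K[i, i] - K[i, idx][None, :] - K[i, idx][:, None]
             + K[np.ix_(idx, idx)])
        # Regularize for numerical stability when C is (near-)singular
        C = C + reg * np.trace(C) * np.eye(len(idx))
        w = np.linalg.solve(C, np.ones(len(idx)))
        W[i, idx] = w / w.sum()  # enforce sum-to-one constraint
    return W
```

With a linear kernel K = X X^T this reduces to ordinary LLE weights; substituting a structure kernel (e.g. a string or graph kernel) is what makes the non-vectorial case possible.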