In this work, we consider dimensionality reduction in supervised settings, focusing specifically on regression problems. We propose a novel algorithm, the supervised distance preserving projection (SDPP). The SDPP locally minimizes the difference between pairwise distances among the projected input covariates and the corresponding distances among the responses. As a result, the local geometric structure of the low-dimensional subspace retrieved by the SDPP mimics that of the response space, which not only facilitates efficient regressor design but also uncovers information useful for visualization. The SDPP achieves this by learning a linear parametric mapping, so it can easily handle out-of-sample data points. For nonlinear data, a kernelized version of the SDPP is also derived. In addition, an intuitive extension of the SDPP is proposed to deal with classification problems. Experimental evaluation on both synthetic and real-world data sets demonstrates the effectiveness of the SDPP, which performs comparably or superiorly to state-of-the-art approaches.
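To make the idea concrete, the following is a minimal sketch, not the paper's implementation, of the core SDPP objective as described above: over a k-nearest-neighbour graph in input space, penalize the squared difference between the pairwise distances of the linearly projected covariates and the pairwise distances of the responses. The function name `sdpp_objective`, the neighbourhood construction, and the use of a generic L-BFGS optimizer are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree

def sdpp_objective(P_flat, X, Y, neighbors, d):
    """Sum over local neighbourhoods of the squared difference between
    projected-input distances and response distances (illustrative sketch)."""
    n, p = X.shape
    P = P_flat.reshape(p, d)          # linear projection matrix
    Z = X @ P                         # projected covariates
    loss = 0.0
    for i, js in enumerate(neighbors):
        for j in js:
            dz = np.sum((Z[i] - Z[j]) ** 2)   # distance in projected space
            dy = np.sum((Y[i] - Y[j]) ** 2)   # distance in response space
            loss += (dz - dy) ** 2
    return loss

# Toy data: the response depends only on the first input coordinate.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
Y = 2.0 * X[:, :1]

# k-nearest-neighbour graph in input space (k = 5, self excluded).
_, idx = cKDTree(X).query(X, k=6)
neighbors = [row[1:] for row in idx]

d = 1                                  # target dimensionality
P0 = rng.normal(size=(X.shape[1], d)).ravel()
res = minimize(sdpp_objective, P0, args=(X, Y, neighbors, d),
               method="L-BFGS-B")
P = res.x.reshape(-1, d)               # learned projection
```

Because the mapping is linear and parametric, new points are embedded simply as `X_new @ P`, which is how the out-of-sample property mentioned above comes for free.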