We study the problem of discovering a manifold that best preserves information relevant to a nonlinear regression. Solving this problem involves extending and uniting two threads of research. On the one hand, the literature on sufficient dimension reduction has focused on methods for finding the best linear subspace for nonlinear regression; we extend this to manifolds. On the other hand, the literature on manifold learning has focused on unsupervised dimensionality reduction; we extend this to the supervised setting. Our approach to solving the problem involves combining the machinery of kernel dimension reduction with Laplacian eigenmaps. Specifically, we optimize cross-covariance operators in kernel feature spaces that are induced by the normalized graph Laplacian. The result is a highly flexible method in which no strong assumptions are made on the regression function or on the distribution of the covariates. We illustrate our methodology on the analysis of global temperature data and image manifolds.
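To make the pipeline concrete, here is a minimal sketch of the two ingredients the abstract combines: a Laplacian-eigenmaps embedding built from a normalized graph Laplacian, and a kernel dependence score between the embedding and the response. Note the assumptions: the neighborhood size `k`, the heat-kernel bandwidth `sigma`, and the use of an HSIC-style statistic as a stand-in for the cross-covariance-operator objective are all illustrative choices, not the paper's exact formulation.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=5, sigma=1.0):
    """Embed X via the bottom eigenvectors of the normalized graph Laplacian."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    # symmetric kNN adjacency with heat-kernel weights (illustrative graph construction)
    W = np.zeros((n, n))
    nbrs = np.argsort(d2, axis=1)[:, 1:k + 1]            # skip self at index 0
    for i in range(n):
        W[i, nbrs[i]] = np.exp(-d2[i, nbrs[i]] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)
    d = W.sum(axis=1)
    Dinv = np.diag(1.0 / np.sqrt(d + 1e-12))
    L = np.eye(n) - Dinv @ W @ Dinv                      # normalized graph Laplacian
    _, vecs = eigh(L)                                    # eigenvalues in ascending order
    # drop the trivial constant eigenvector, keep the next n_components
    return vecs[:, 1:n_components + 1]

def kernel_dependence(Z, y, sigma=1.0):
    """HSIC-style dependence between embedding Z and response y
    (a simple proxy for the cross-covariance-operator criterion)."""
    def gram(A):
        d2 = ((A[:, None, :] - A[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))            # Gaussian kernel Gram matrix
    n = Z.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n                  # centering matrix
    K, Ky = gram(Z), gram(np.asarray(y).reshape(-1, 1))
    return np.trace(H @ K @ H @ Ky) / (n - 1) ** 2
```

On a noisy curve in R^3 whose response varies smoothly along the manifold coordinate, the embedding recovers a low-dimensional representation and the dependence score stays nonnegative; a supervised method along the lines described above would search over graph/kernel parameters to maximize such a dependence rather than fix them in advance.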