We propose "supervised principal component analysis" (supervised PCA), a generalization of PCA that is particularly effective for regression and classification problems with high-dimensional input data. It works by estimating a sequence of principal components that have maximal dependence on the response variable. Supervised PCA is solvable in closed form, and it has a dual formulation that significantly reduces the computational cost of problems in which the number of predictors greatly exceeds the number of observations (such as DNA microarray experiments). Furthermore, we show how the algorithm can be kernelized, which makes it applicable to nonlinear dimensionality reduction tasks. Experimental results on a range of visualization, classification, and regression problems show significant improvements over other supervised approaches in both accuracy and computational efficiency.
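To make the closed-form solution concrete, the following is a minimal sketch of one common HSIC-based formulation of supervised PCA: the projection directions are the top eigenvectors of a response-weighted scatter matrix built from a centered kernel on the response. The function name, the choice of a linear kernel on the response, and the matrix layout (columns as observations) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def supervised_pca(X, Y, n_components=2):
    """Illustrative HSIC-based supervised PCA sketch (not the paper's exact code).

    X : (d, n) data matrix, columns are observations.
    Y : (n, k) response matrix (labels or regression targets).
    Returns U : (d, n_components) orthonormal projection directions.
    """
    n = X.shape[1]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    L = Y @ Y.T                           # linear kernel on the response
    Q = X @ H @ L @ H @ X.T               # dependence-weighted scatter matrix
    eigvals, eigvecs = np.linalg.eigh(Q)  # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]     # sort eigenvalues descending
    return eigvecs[:, order[:n_components]]

# Usage: project data onto the supervised components via U.T @ X.
```

Setting `L` to the identity recovers ordinary PCA, while swapping in a nonlinear kernel on the response (or on the data, via the dual) yields the kernelized variants mentioned above.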