Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance when the underlying data structure is nonlinear or the cluster structure is complex. To address these problems, we investigate a kernel analogue of OCA, which consists of applying OCA techniques to the data after it has been mapped nonlinearly into a new feature space, typically a high-dimensional (possibly infinite-dimensional) Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.
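The construction rests on the standard kernel device: the nonlinear map Φ is never computed explicitly; a positive-definite kernel k(x, y) = ⟨Φ(x), Φ(y)⟩ supplies every inner product the feature space requires. The sketch below is only an illustration of that device, not the paper's algorithm: the Gaussian kernel and the kernel-PCA-style double centering are our assumptions, chosen to show how a Gram matrix yields finite-dimensional coordinates of the implicitly mapped points, on which a linear subspace search such as OCA could then operate.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def feature_space_coordinates(K):
    """Center K in feature space and return coordinates Z of the
    mapped points such that Z @ Z.T equals the centered Gram matrix
    (the kernel-PCA construction; an assumption, not the paper's)."""
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one  # double centering
    vals, vecs = np.linalg.eigh(Kc)             # ascending eigenvalues
    keep = vals > 1e-10                         # drop numerically null directions
    vals, vecs = vals[keep], vecs[:, keep]
    # Rows are coordinates of Phi(x_i) within the span of the mapped
    # data; any linear method can now be run on this representation.
    return vecs * np.sqrt(vals)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))   # toy data: 40 points in R^5
    Z = feature_space_coordinates(gaussian_gram(X, sigma=2.0))
    print(Z.shape)                 # at most 40 feature-space coordinates
```

Because the mapped data span at most an n-dimensional subspace of the Hilbert space (n being the number of samples), such coordinates are always finite-dimensional, which is what makes a linear technique like OCA applicable even when the ambient feature space is infinite-dimensional.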