A Learning Algorithm of Boosting Kernel Discriminant Analysis for Pattern Recognition
IEICE Transactions on Information and Systems
Kernel discriminant analysis (KDA) is one of the most effective nonlinear techniques for dimensionality reduction and feature extraction. It applies to a wide range of high-dimensional data, including images, gene expression profiles, and text. This paper develops a new algorithm that further improves the overall performance of KDA by effectively integrating boosting with KDA. The proposed method, called boosting kernel discriminant analysis (BKDA), possesses several appealing properties. First, like all kernel methods, it handles nonlinearity in a disciplined and computationally attractive manner; second, by introducing pairwise class discriminant information into the discriminant criterion and employing boosting to robustly adjust that information, it improves classification accuracy; third, by extracting the significant discriminant information in the null space of the within-class scatter operator, it effectively handles the small sample size problem that KDA frequently faces in real-world applications; fourth, by combining boosting with KDA, it constitutes a strong ensemble-based KDA framework. Experimental results on gene expression data demonstrate the promising performance of the proposed methodology.
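To make the underlying KDA step concrete, below is a minimal NumPy sketch of a two-class kernel Fisher discriminant. This illustrates plain KDA only, not the paper's BKDA algorithm: the boosting stage, the pairwise discriminant weighting, and the null-space computation are omitted, and a simple ridge regularizer stands in for the null-space treatment of the within-class scatter. The RBF kernel, `gamma`, `reg`, and the toy ring data are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant; returns dual coefficients alpha.

    The projection of a point x is sum_j alpha_j * k(x_j, x).
    Regularization (reg) is a stand-in for the null-space handling of the
    within-class scatter used in the paper.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    M0 = K[:, idx0].mean(axis=1)   # mean kernel column for class 0
    M1 = K[:, idx1].mean(axis=1)   # mean kernel column for class 1
    # Within-class scatter in feature space, expressed in the dual.
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Ki = K[:, idx]
        l = len(idx)
        N += Ki @ (np.eye(l) - np.full((l, l), 1.0 / l)) @ Ki.T
    N += reg * np.eye(n)
    # Direction maximizing the (regularized) Fisher criterion.
    return np.linalg.solve(N, M1 - M0)

# Toy nonlinear problem: two concentric noisy rings, not linearly separable.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.r_[np.full(100, 1.0), np.full(100, 3.0)] + 0.1 * rng.standard_normal(200)
X = np.c_[r * np.cos(t), r * np.sin(t)]
y = np.r_[np.zeros(100, int), np.ones(100, int)]

alpha = kfd_fit(X, y, gamma=0.5)
proj = rbf_kernel(X, X, 0.5) @ alpha        # 1-D discriminant projection
# Classify by the nearest class mean on the projected axis.
m0, m1 = proj[y == 0].mean(), proj[y == 1].mean()
pred = (np.abs(proj - m1) < np.abs(proj - m0)).astype(int)
acc = (pred == y).mean()
```

In a boosted variant such as the one the paper proposes, this single discriminant would be refit repeatedly on reweighted data, with the sample (or class-pair) weights adjusted by the boosting rule after each round.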