Kernel Optimization in Discriminant Analysis
IEEE Transactions on Pattern Analysis and Machine Intelligence
Kernel mapping has attracted a great deal of attention from researchers in pattern recognition and statistical machine learning, and kernel-based approaches are a natural choice whenever a nonlinear classification model is needed. This paper proposes a nonlinear classification approach based on the nonparametric version of Fisher's discriminant analysis. The technique efficiently finds a nonparametric kernel representation in which linear discriminants perform better. Data classification is achieved by integrating the linear nonparametric Fisher's discriminant analysis with the kernel mapping. Using the kernel trick, we derive a new formulation of Fisher's criterion expressed solely in terms of inner products of the original input data. Experimental results demonstrate that our approach is competitive with major state-of-the-art methods.
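To illustrate the general idea of expressing Fisher's criterion through inner products, the following is a minimal two-class kernel Fisher discriminant sketch. It is not the paper's nonparametric variant: it uses the standard parametric class means and scatter in the kernel-induced feature space, an RBF kernel, and a ridge term `reg` for numerical stability (all of these are assumptions, not details taken from the paper).

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfda_fit(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant (parametric version).

    Returns dual coefficients alpha such that the discriminant
    direction in feature space is w = sum_i alpha_i * phi(x_i),
    i.e. everything is written in terms of kernel evaluations only.
    """
    K = rbf_kernel(X, X, gamma)              # n x n Gram matrix
    n = len(y)
    classes = np.unique(y)
    assert len(classes) == 2, "this sketch handles two classes"
    M = []                                   # dual-form class means
    N = np.zeros((n, n))                     # within-class scatter (dual form)
    for c in classes:
        idx = np.where(y == c)[0]
        Kc = K[:, idx]                       # kernel columns of class c
        M.append(Kc.mean(axis=1))
        l = len(idx)
        N += Kc @ (np.eye(l) - np.full((l, l), 1.0 / l)) @ Kc.T
    diff = M[0] - M[1]
    # alpha maximizes the Fisher ratio; solve (N + reg*I) alpha = M1 - M2
    alpha = np.linalg.solve(N + reg * np.eye(n), diff)
    return alpha

def kfda_project(alpha, X_train, X_new, gamma=1.0):
    # Project new points onto the learned discriminant direction
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Because the solution is written entirely in terms of the Gram matrix, the same code works for any positive-definite kernel; only `rbf_kernel` would need to be swapped out.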