It is widely recognized that the performance of kernel-based methods depends on how well the selected kernel matches the data. Ideally, the data would be linearly separable in the kernel-induced feature space, in which case the Fisher linear discriminant criterion can serve as a cost function for optimizing the kernel. In many applications, however, the data may remain linearly inseparable even after the kernel transformation, for example when it has a multimodal distribution. In such cases a nonlinear classifier is preferred, and the Fisher criterion is clearly not a suitable kernel optimization rule. Motivated by this issue, we propose a localized kernel Fisher criterion, in place of the traditional Fisher criterion, as the kernel optimization rule; it increases the local margins between embedded classes in the kernel-induced feature space. Experimental results on several benchmark datasets and on measured radar high-resolution range profile (HRRP) data show that the proposed method improves classification performance.
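The traditional kernel Fisher criterion that the abstract contrasts with can be evaluated directly from the Gram matrix, without forming feature-space vectors. Below is a minimal sketch, assuming an RBF kernel and a two-class toy problem; it scores candidate kernel parameters by the ratio tr(S_b)/tr(S_w) in the kernel-induced feature space. The function names and the grid of gamma values are illustrative, not taken from the paper, and this is the global (non-localized) criterion only.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """RBF Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_fisher_criterion(K, y):
    """Class separability J = tr(S_b) / tr(S_w) in the kernel-induced
    feature space, computed from the Gram matrix K alone:
      tr(S_w) = sum_i K_ii - sum_c n_c * mean(K_cc)
      tr(S_b) = sum_c n_c * mean(K_cc) - n * mean(K)
    where K_cc is the within-class block of K for class c."""
    n = len(y)
    within, between = 0.0, 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nc = len(idx)
        block_mean = K[np.ix_(idx, idx)].mean()
        within += K[idx, idx].sum() - nc * block_mean   # diagonal terms minus class-mean term
        between += nc * block_mean
    between -= n * K.mean()
    return between / within

# Toy two-class data: larger J indicates a kernel under which the
# embedded classes are better separated (in the global Fisher sense).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)),
               rng.normal(3.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
scores = {g: kernel_fisher_criterion(rbf_kernel(X, g), y)
          for g in (0.01, 0.1, 1.0)}
```

Selecting the gamma with the largest score is the global-criterion baseline; the paper's contribution is to replace this single global ratio with localized margins so that multimodal classes are handled.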