In this paper, we develop a novel approach to kernel parameter selection for kernel Fisher discriminant analysis (KFDA), based on the view that the optimal kernel parameter is the one that maximizes the linear separability of the samples in the feature space. This makes our criterion for selecting the kernel parameter fully consistent with the essence of KFDA. Indeed, this is the first work to determine the kernel parameter of KFDA using a search algorithm. The proposed approach first constructs an objective function whose minimum corresponds exactly to the maximum of linear separability, and then applies a minimum search algorithm to find the optimal kernel parameter. The convergence properties of the search algorithm allow the approach to work well, and the algorithm is simple and computationally inexpensive. Experimental results illustrate the effectiveness of our approach.
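As a rough illustration of the two-step idea (an objective whose minimum corresponds to maximum separability, followed by a derivative-free minimum search), the sketch below tunes the width sigma of an RBF kernel by running a Nelder-Mead search over a simple feature-space separability proxy. The proxy objective, the RBF kernel, the toy data, and the use of SciPy's Nelder-Mead routine are illustrative assumptions, not the paper's exact formulation.

import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def rbf_kernel(X, Y, sigma):
    # k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))
    return np.exp(-cdist(X, Y, "sqeuclidean") / (2.0 * sigma ** 2))

def neg_separability(log_sigma, X1, X2):
    # Illustrative objective (an assumption, not the paper's): negative ratio of
    # the mean between-class to the mean within-class squared distance in the
    # feature space induced by the RBF kernel, where
    # ||phi(x) - phi(y)||^2 = 2 - 2 k(x, y).
    sigma = np.exp(log_sigma[0])  # search over log(sigma) to keep sigma positive
    within = (np.mean(2.0 - 2.0 * rbf_kernel(X1, X1, sigma))
              + np.mean(2.0 - 2.0 * rbf_kernel(X2, X2, sigma))) / 2.0
    between = np.mean(2.0 - 2.0 * rbf_kernel(X1, X2, sigma))
    return -(between / (within + 1e-12))  # minimum <=> maximum separability

# Toy two-class data; in practice X1 and X2 hold the training samples of each class.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=0.0, scale=1.0, size=(40, 5))
X2 = rng.normal(loc=2.0, scale=1.0, size=(40, 5))

# Derivative-free simplex (Nelder-Mead) search for the kernel parameter.
result = minimize(neg_separability, x0=[0.0], args=(X1, X2), method="Nelder-Mead")
print("selected RBF width sigma =", float(np.exp(result.x[0])))

Searching over log(sigma) keeps the kernel width positive without imposing box constraints on the unconstrained simplex search; the selected sigma would then be used to train KFDA in the usual way.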