Pattern Classification (2nd Edition)
Kernel Methods for Pattern Analysis
Convex Optimization
A DC-programming algorithm for kernel selection
Proceedings of the 23rd International Conference on Machine Learning (ICML '06)
Optimal kernel selection in Kernel Fisher discriminant analysis
Proceedings of the 23rd International Conference on Machine Learning (ICML '06)
A criterion for optimizing kernel parameters in KBDA for image retrieval
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Optimizing the kernel in the empirical feature space
IEEE Transactions on Neural Networks
In this brief, we consider kernel methods for classification (Shawe-Taylor and Cristianini, 2004) from a separability point of view and provide a representation of the Fisher criterion function in a kernel feature space. We then show that the value of the Fisher criterion can be computed simply from the averages of the diagonal and off-diagonal blocks of a kernel matrix. This result further reveals that the ideal kernel matrix is a global solution to the problem of maximizing the Fisher criterion function. Its relation to an empirical kernel target alignment is then reported. To demonstrate the usefulness of these theories, we provide an application study on the classification of prostate cancer based on microarray data sets. The results show that the parameter of a kernel function can be readily optimized.
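The two quantities the abstract builds on, class-block averages of a kernel matrix and empirical alignment with the ideal kernel yy^T, can be sketched as follows. This is an illustrative Python example, not the paper's exact Fisher-criterion formula: the function names, the RBF kernel, the two-cluster toy data, and the alignment normalization A(K, yy^T) = <K, yy^T>_F / (||K||_F ||yy^T||_F) are assumptions drawn from the general kernel-target-alignment literature.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def block_averages(K, n1):
    # Partition K by class: samples 0..n1-1 are class 1, the rest class 2.
    # Diagonal blocks K11, K22 hold within-class similarities;
    # the off-diagonal block K12 holds between-class similarities.
    K11, K22, K12 = K[:n1, :n1], K[n1:, n1:], K[:n1, n1:]
    return K11.mean(), K22.mean(), K12.mean()

def alignment(K, y):
    # Empirical alignment between K and the ideal kernel yy^T
    # (labels in {+1, -1}), normalized by Frobenius norms.
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

# Toy data: two well-separated Gaussian clusters (hypothetical example).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (10, 2)),
               rng.normal(5.0, 0.1, (10, 2))])
y = np.r_[np.ones(10), -np.ones(10)]

K = rbf_kernel(X, X, gamma=1.0)
m11, m22, m12 = block_averages(K, n1=10)
a = alignment(K, y)
```

For separable classes one expects large diagonal-block averages relative to the off-diagonal block, and an alignment close to 1; sweeping `gamma` and picking the value that maximizes such a separability score is the kind of kernel-parameter optimization the abstract refers to.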