The advantage of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix by maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). Notably, optimizing this criterion does not require inverting the possibly singular within-class scatter matrix, a computational problem encountered by many LDA and KFD methods. Experiments on synthetic data and on real-world data from the UCI repository and the FERET face database show that our method consistently outperforms several previous kernel learning methods.
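To make the inversion-free idea concrete, below is a minimal sketch, assuming a trace-ratio form of the separability criterion, J(K) = tr(S_b) / tr(S_w), where the between-class and within-class scatter traces are evaluated in the kernel-induced feature space directly from the kernel matrix K and the labels. The abstract does not give the paper's exact criterion; the function name `separability`, the trace-ratio form, and the RBF-kernel example are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def separability(K, y):
    """Trace-based class separability of an n x n kernel matrix K.

    Illustrative criterion (an assumption, not the paper's exact one):
    J = tr(S_b) / tr(S_w), with scatter traces computed in the
    kernel-induced feature space. Only traces appear, so no scatter
    matrix is ever formed or inverted.
    """
    n = K.shape[0]
    ones = np.ones(n)
    # tr(S_t) = tr(K) - (1/n) * 1' K 1
    tr_total = np.trace(K) - ones @ K @ ones / n
    # tr(S_w) = tr(K) - sum_c (1/n_c) * 1_c' K 1_c
    tr_within = np.trace(K)
    for c in np.unique(y):
        m = (y == c).astype(float)
        tr_within -= m @ K @ m / m.sum()
    # tr(S_b) = tr(S_t) - tr(S_w)
    tr_between = tr_total - tr_within
    return tr_between / tr_within

# Toy usage: score two RBF kernel bandwidths on synthetic two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(4.0, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
for gamma in (0.01, 1.0):
    K = np.exp(-gamma * sq_dists)
    print(f"gamma={gamma}: J={separability(K, y):.3f}")
```

Because the score involves only scalar traces, it remains well-defined even when the within-class scatter matrix itself is singular (provided tr(S_w) > 0), which mirrors the computational advantage the abstract claims over inversion-based LDA and KFD formulations.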