This paper presents a novel algorithm for optimizing the Gaussian kernel in pattern classification tasks, where it is desirable for samples to be well separated in the kernel feature space. We propose to optimize the Gaussian kernel parameters by maximizing a classical class separability criterion; the resulting problem is solved with a quasi-Newton algorithm that exploits a recently proposed decomposition of the objective criterion. The proposed method is evaluated on five data sets with two kernel-based learning algorithms. The experimental results indicate that it achieves the best overall classification performance among the three competing solutions compared. In particular, the proposed method provides a valuable kernel optimization solution in the severe small-sample-size scenario.
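The idea of tuning the Gaussian kernel by maximizing class separability can be sketched as follows. This is a simplified illustration, not the paper's algorithm: it uses a standard kernel-space separability ratio, tr(S_b)/tr(S_w), computed entirely from the kernel matrix, and a generic bounded scalar optimizer in place of the paper's quasi-Newton scheme with its decomposition of the objective. All function names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gaussian_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def class_separability(K, y):
    """Ratio tr(S_b) / tr(S_w) of between- to within-class scatter in the
    kernel feature space, expressed via kernel matrix sums only."""
    n = len(y)
    # Total scatter: sum_i ||phi(x_i) - m||^2 = tr(K) - (1/n) * sum_ij K_ij
    tr_St = np.trace(K) - K.sum() / n
    # Within-class scatter: tr(K) - sum_c (1/n_c) * sum_{i,j in c} K_ij
    tr_Sw = np.trace(K)
    for c in np.unique(y):
        idx = (y == c)
        tr_Sw -= K[np.ix_(idx, idx)].sum() / idx.sum()
    tr_Sb = tr_St - tr_Sw  # S_t = S_b + S_w
    return tr_Sb / max(tr_Sw, 1e-12)

def optimize_gamma(X, y, log10_bounds=(-4.0, 2.0)):
    """Pick gamma maximizing class separability (search over log10(gamma))."""
    res = minimize_scalar(
        lambda lg: -class_separability(gaussian_kernel(X, 10.0 ** lg), y),
        bounds=log10_bounds, method="bounded")
    return 10.0 ** res.x
```

A one-dimensional search suffices here because a single spread parameter is optimized; for the general Gaussian kernels with multiple parameters discussed above, a quasi-Newton method with analytic gradients of the criterion is the natural replacement.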