We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. To optimise the classifier's generalisation capability, an orthogonal forward selection procedure selects kernels one at a time by directly minimising the leave-one-out (LOO) misclassification rate. It is shown that the LOO misclassification rate can be computed very efficiently owing to the orthogonalisation. Examples demonstrate that, in terms of both performance and computational efficiency, the proposed algorithm is a viable alternative for constructing sparse two-class kernel classifiers.
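The idea of forward selection driven by the LOO misclassification rate can be sketched as follows. This is a naive illustration, not the paper's method: it refits a least-squares model for every candidate kernel and obtains the LOO predictions from the hat-matrix identity e_i/(1 - h_ii), instead of using the efficient orthogonal decomposition described in the abstract. The Gaussian kernel width, the bias column, the toy data, and the stopping rule are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, centers, width=2.0):
    """Gaussian kernel matrix between rows of X and candidate centres (width is an assumption)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_error_rate(Phi, y):
    """LOO misclassification rate of a least-squares classifier on design matrix Phi.

    Uses the closed-form LOO residual e_i / (1 - h_ii), where h_ii are the
    diagonal entries of the hat matrix H = Phi Phi^+, so nothing is refitted
    n times (the paper obtains the same quantity far more cheaply via
    orthogonalisation).
    """
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    h = np.clip(np.diag(Phi @ np.linalg.pinv(Phi)), 0.0, 1.0 - 1e-8)
    e = y - Phi @ w
    loo_pred = y - e / (1.0 - h)          # leave-one-out prediction for x_i
    return float(np.mean(y * loo_pred <= 0.0))

def design(K, idx):
    """Selected kernel columns plus a constant bias column (bias is an assumption)."""
    return np.column_stack([np.ones(K.shape[0]), K[:, idx]])

def forward_select(K, y, max_terms=10):
    """Greedy forward selection of kernels minimising the LOO misclassification rate."""
    selected, best_rate = [], loo_error_rate(design(K, []), y)
    for _ in range(max_terms):
        trials = [(loo_error_rate(design(K, selected + [j]), y), j)
                  for j in range(K.shape[1]) if j not in selected]
        rate, j = min(trials)
        if rate >= best_rate:             # stop once the LOO rate stops improving
            break
        selected.append(j)
        best_rate = rate
    return selected, best_rate

# Toy two-class problem: two well-separated Gaussian clusters, labels -1/+1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (40, 2)), rng.normal(2.0, 1.0, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])
K = rbf_kernel(X, X)                      # every training point is a candidate centre
selected, rate = forward_select(K, y)
print(len(selected), rate)
```

Because each greedy step keeps only the kernel that most reduces the LOO rate and stops as soon as no candidate improves it, the resulting classifier uses only a handful of the 80 candidate centres, which is the sparsity the abstract refers to.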