Optimization of the SVM Kernels Using an Empirical Error Minimization Scheme
SVM '02 Proceedings of the First International Workshop on Pattern Recognition with Support Vector Machines
This approach aims to optimize the kernel parameters and to efficiently reduce the number of support vectors, so that the generalization error is reduced drastically. The proposed methodology introduces a new model selection criterion based on an estimate of the probability of error of the SVM classifier. For comparison, we consider two further model selection criteria: GACV ('Generalized Approximate Cross-Validation') and the VC ('Vapnik-Chervonenkis') dimension. Both are algebraic estimates of upper bounds on the expected error. For the former, we also propose a new minimization scheme. Experiments on a bi-class problem show that the empirical error criterion adequately selects the SVM hyper-parameters and, moreover, yields a less complex model with fewer support vectors. For multi-class data, the optimization strategy is adapted to the one-against-one data partitioning. The approach is then evaluated on images of handwritten digits from the USPS database.
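To make the idea of model selection by empirical error minimization concrete, the following is a minimal sketch, not the paper's exact algorithm: it replaces the SVM with a simple RBF Parzen-window classifier and selects the kernel width gamma over a grid by minimizing the error rate measured on a held-out validation split. The grid values, data generator, and classifier are all illustrative assumptions.

```python
import math
import random

def rbf(x, y, gamma):
    # RBF kernel between two 2-D points
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def parzen_predict(x, train, gamma):
    # Parzen-window classifier: accumulate kernel mass per class,
    # predict the class with the largest total score.
    scores = {}
    for xi, yi in train:
        scores[yi] = scores.get(yi, 0.0) + rbf(x, xi, gamma)
    return max(scores, key=scores.get)

def empirical_error(gamma, train, val):
    # Empirical estimate of the probability of error on held-out data
    wrong = sum(1 for x, y in val if parzen_predict(x, train, gamma) != y)
    return wrong / len(val)

random.seed(0)
# Synthetic bi-class problem: two Gaussian blobs
data = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(60)] + \
       [((random.gauss(3, 1), random.gauss(3, 1)), 1) for _ in range(60)]
random.shuffle(data)
train, val = data[:80], data[80:]

# Model selection: pick the gamma minimizing the empirical error estimate
grid = [0.01, 0.1, 1.0, 10.0]
best = min(grid, key=lambda g: empirical_error(g, train, val))
print("selected gamma:", best)
```

In the paper's setting the error estimate is minimized over the SVM hyper-parameters directly; the grid search above stands in for that optimization step, and a smoothed, differentiable error estimate would allow gradient-based minimization instead.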