We consider support vector machines for binary classification. Unlike most approaches, we use the number of support vectors (the "L0 norm") as the regularizing term instead of the L1 or L2 norm. To solve the resulting combinatorial optimization problem, we use the cross-entropy method to search over candidate sets of support vectors. The algorithm amounts to solving a sequence of efficient linear programs. We report experiments in which our method achieves generalization errors comparable to standard support vector machines while using considerably fewer support vectors.
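The abstract does not spell out the search procedure, but the core idea of the cross-entropy method for subset selection can be sketched as follows: maintain independent Bernoulli inclusion probabilities over the candidate support vectors, sample subsets, score each subset, and refit the probabilities to the elite (lowest-cost) samples. The sketch below is a hypothetical illustration, not the paper's implementation; in the paper each candidate set would be scored by solving a linear program, which we replace here with a toy cost function (`toy`) so the example is self-contained.

```python
import random

def cross_entropy_subset(cost, n, n_samples=200, elite_frac=0.1,
                         smoothing=0.7, iters=50, seed=0):
    """Cross-entropy method for subset selection (illustrative sketch).

    cost: function mapping a frozenset of indices to a scalar to minimize
          (in the paper, the value of a linear program for that subset).
    n:    number of candidate items (e.g. training points).
    """
    rng = random.Random(seed)
    p = [0.5] * n                          # per-item inclusion probabilities
    n_elite = max(1, int(elite_frac * n_samples))
    best, best_cost = None, float("inf")
    for _ in range(iters):
        samples = []
        for _ in range(n_samples):
            s = frozenset(i for i in range(n) if rng.random() < p[i])
            c = cost(s)
            samples.append((c, s))
            if c < best_cost:
                best, best_cost = s, c
        samples.sort(key=lambda t: t[0])
        elite = [s for _, s in samples[:n_elite]]
        # Refit: move p toward the empirical inclusion frequencies
        # observed in the elite samples (with smoothing).
        for i in range(n):
            freq = sum(i in s for s in elite) / len(elite)
            p[i] = smoothing * freq + (1 - smoothing) * p[i]
    return best, best_cost

# Toy stand-in for the LP-based score: prefer small subsets that
# still contain items 0 and 3 (heavy penalty if either is missing).
toy = lambda s: len(s) + 10 * ((0 not in s) + (3 not in s))
sel, c = cross_entropy_subset(toy, n=8)
```

The smoothing parameter keeps the distribution from collapsing prematurely, which is the standard safeguard in cross-entropy optimization; on this toy cost the search quickly concentrates on the unique minimizer {0, 3}.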