This paper proposes statistical criteria for early stopping of support vector machines, for both regression and classification problems. The method halts the minimization of the primal functional when the moments of the error signal (up to fourth order) become stationary, rather than when the primal objective itself falls below a convergence tolerance. This simple strategy reduces the computational cost of training, with no significant differences observed in performance or sparsity.
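The stopping rule can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a simple relative-change test for stationarity of the first four moments of the error signal, with the tolerance `tol` and the synthetic, shrinking error sequence chosen purely for illustration.

```python
import numpy as np

def moments(err, k=4):
    # Raw moments of the error signal up to order k (here k = 4,
    # matching the fourth-order criterion described in the abstract).
    return np.array([np.mean(err ** p) for p in range(1, k + 1)])

def moments_stationary(prev, curr, tol=1e-4):
    # Declare stationarity when every moment changed by less than
    # a relative tolerance between consecutive iterations.
    return bool(np.all(np.abs(curr - prev) <= tol * (np.abs(prev) + 1e-12)))

# Toy illustration: an error signal whose scale stabilizes over
# "training iterations", standing in for SVM primal minimization.
rng = np.random.default_rng(0)
base = rng.normal(size=1000)
prev = moments(base)
stopped_at = None
for t in range(1, 50):
    err = base * (0.5 + 0.5 * np.exp(-t))  # error signal settling down
    curr = moments(err)
    if moments_stationary(prev, curr):
        stopped_at = t  # training would be stopped here
        break
    prev = curr
```

In this toy run the loop exits well before the iteration budget is exhausted, which mirrors the abstract's claim: stopping on moment stationarity triggers earlier than waiting for tight convergence of the objective itself.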