Global Convergence Analysis of Decomposition Methods for Support Vector Regression
ISNN '08 Proceedings of the 5th international symposium on Neural Networks: Advances in Neural Networks
Decomposition methods are well-known techniques for solving the quadratic programming (QP) problems that arise in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP subproblem in only those variables is solved. Since no large matrix computations are required, decomposition methods are applicable to large QP problems. In this paper, we present a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems, and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations under a very mild condition on how the variables are selected.
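To make the iteration described above concrete, the following is a minimal sketch of a size-two decomposition method (an SMO-style working set) for the SVM classification dual, written in Python with NumPy. The function names (rbf_kernel, smo_decomposition), the Gaussian kernel, the maximal-violating-pair selection rule, and the tolerance tau used as a relaxed stopping condition are illustrative assumptions, not the paper's exact formulation.

# A minimal sketch of a decomposition method for the SVM classification dual:
# in each iteration, two variables are selected and the two-variable QP
# subproblem is solved analytically. Illustrative assumption, not the
# paper's algorithm.
import numpy as np


def rbf_kernel(X, gamma=0.5):
    """Gram matrix of a Gaussian kernel (an assumed kernel choice)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)


def smo_decomposition(K, y, C=1.0, tau=1e-3, max_iter=10_000):
    """Solve min_a 0.5 a^T Q a - e^T a  s.t.  0 <= a_i <= C, y^T a = 0,
    where Q_ij = y_i y_j K_ij, by repeatedly optimizing two variables.
    Stops when the maximal KKT violation drops below tau, i.e. when a
    relaxed optimality condition is satisfied."""
    n = len(y)
    a = np.zeros(n)
    Q = (y[:, None] * y[None, :]) * K
    G = -np.ones(n)                       # gradient of the dual objective at a = 0

    for _ in range(max_iter):
        # Working-set selection: maximal violating pair over the index
        # sets I_up / I_low (one mild selection rule among many admissible ones).
        up = ((y > 0) & (a < C)) | ((y < 0) & (a > 0))
        low = ((y > 0) & (a > 0)) | ((y < 0) & (a < C))
        viol_up = np.where(up, -y * G, -np.inf)
        viol_low = np.where(low, -y * G, np.inf)
        i = int(np.argmax(viol_up))
        j = int(np.argmin(viol_low))
        if viol_up[i] - viol_low[j] <= tau:   # relaxed optimality reached
            break

        # Solve the two-variable subproblem analytically (Platt-style step).
        eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
        if eta <= 1e-12:
            eta = 1e-12                       # guard against a flat direction
        Ei, Ej = y[i] * G[i], y[j] * G[j]     # prediction errors (no bias term)
        aj_new = a[j] + y[j] * (Ei - Ej) / eta
        if y[i] != y[j]:
            L, H = max(0.0, a[j] - a[i]), min(C, C + a[j] - a[i])
        else:
            L, H = max(0.0, a[i] + a[j] - C), min(C, a[i] + a[j])
        aj_new = min(max(aj_new, L), H)
        ai_new = a[i] + y[i] * y[j] * (a[j] - aj_new)

        # Update the gradient using only the two changed variables, so no
        # large matrix computation is required per iteration.
        G += Q[:, i] * (ai_new - a[i]) + Q[:, j] * (aj_new - a[j])
        a[i], a[j] = ai_new, aj_new

    return a

In this sketch, any selection rule that keeps picking a pair violating the optimality condition (whenever one exists) keeps every subproblem small and every iteration cheap; the tolerance tau plays the role of the relaxed optimality condition under which finite termination is analyzed.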