This paper presents a strategy to optimize the learning phase of the Support Vector Machine (SVM) algorithm. SVMs are widely used for tasks such as classification, regression, density estimation, and clustering. However, the algorithm has important drawbacks when learning large-scale problems. Training an SVM requires solving a quadratic programming (QP) problem, which is very resource-intensive. Moreover, during the learning step the best working set must be selected, which is itself a hard task. In this work, we combine a heuristic approach that selects the best working set with a projected conjugate gradient method, a fast and easy-to-implement algorithm for solving the QP problem involved in SVM training. We compare the performance of the optimization strategies on several well-known benchmark databases.
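To illustrate the general idea, here is a minimal sketch, not the authors' algorithm: a projected gradient ascent on the SVM dual with box constraints, paired with a size-one working-set heuristic that updates the multiplier with the largest projected-gradient violation. The function names, learning rate, and the omission of the bias/equality constraint are all simplifying assumptions for illustration; the paper uses a projected *conjugate* gradient solver and a more elaborate working-set selection.

```python
# Hypothetical sketch of working-set selection + projected gradient on the
# SVM dual. Simplifications: linear kernel, no bias term, the equality
# constraint sum_i alpha_i y_i = 0 is dropped, and the working set has size 1.

def linear_kernel(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def train_svm(X, y, C=1.0, lr=0.1, iters=500):
    n = len(X)
    K = [[linear_kernel(X[i], X[j]) for j in range(n)] for i in range(n)]
    alpha = [0.0] * n
    for _ in range(iters):
        # Gradient of the dual objective: dW/dalpha_i = 1 - y_i sum_j alpha_j y_j K_ij
        grad = [1.0 - y[i] * sum(alpha[j] * y[j] * K[i][j] for j in range(n))
                for i in range(n)]

        # Projected gradient: zero out directions that would leave the box [0, C]
        def proj_grad(i):
            if alpha[i] <= 0.0 and grad[i] < 0.0:
                return 0.0
            if alpha[i] >= C and grad[i] > 0.0:
                return 0.0
            return grad[i]

        # Heuristic working set: the single most-violating multiplier
        i = max(range(n), key=lambda k: abs(proj_grad(k)))

        # Gradient step on that coordinate, projected back onto the box
        alpha[i] = min(C, max(0.0, alpha[i] + lr * grad[i]))
    return alpha

def predict(X_train, y, alpha, x):
    s = sum(a * yi * linear_kernel(xi, x)
            for a, yi, xi in zip(alpha, y, X_train))
    return 1 if s >= 0 else -1
```

In a real implementation the working set would contain several indices and the inner QP restricted to it would be solved with conjugate gradient steps rather than a single projected gradient update; the sketch only shows how selection and projection interact.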