A Simple Decomposition Method for Support Vector Machines
Machine Learning
The support vector machine (SVM) is a promising technique for pattern recognition. Training it requires the solution of a large dense quadratic programming problem, and traditional optimization methods cannot be applied directly because of memory restrictions. Few methods can cope with this memory problem; an important one is the "decomposition method," for which, however, no convergence proof has been available so far. We connect this method to projected gradient methods and provide theoretical proofs for a version of decomposition methods. An extension to the bound-constrained formulation of SVM is also provided. We then show that this convergence proof remains valid for general decomposition methods whose working set selection meets a simple requirement.
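The decomposition idea described above — optimizing only a small working set of dual variables per iteration while all others stay fixed — can be illustrated with a minimal SMO-style sketch (working set of size two, linear kernel, maximal-violating-pair selection). This is an illustration under simplifying assumptions, not the paper's exact algorithm; the function name `smo_train` and all parameter defaults are invented for the example.

```python
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-6, max_iter=10_000):
    """SMO-style decomposition sketch for the SVM dual (linear kernel):
        min 0.5 a^T Q a - e^T a   s.t.  y^T a = 0,  0 <= a <= C,
    solving a two-variable subproblem analytically at each iteration."""
    n = len(y)
    Q = (y[:, None] * y[None, :]) * (X @ X.T)   # Q_st = y_s y_t <x_s, x_t>
    alpha = np.zeros(n)
    grad = -np.ones(n)                          # gradient of the dual at alpha = 0
    for _ in range(max_iter):
        Gy = -y * grad
        # Index sets for the maximal-violating-pair working set selection.
        up = ((alpha < C) & (y > 0)) | ((alpha > 0) & (y < 0))
        low = ((alpha < C) & (y < 0)) | ((alpha > 0) & (y > 0))
        i = int(np.argmax(np.where(up, Gy, -np.inf)))
        j = int(np.argmin(np.where(low, Gy, np.inf)))
        if Gy[i] - Gy[j] < tol:                 # KKT conditions met within tol
            break
        # Analytic step along (y_i e_i - y_j e_j), clipped to the box [0, C].
        quad = Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j]
        t = (Gy[i] - Gy[j]) / quad if quad > 0 else np.inf
        t = min(t,
                C - alpha[i] if y[i] > 0 else alpha[i],
                alpha[j] if y[j] > 0 else C - alpha[j])
        alpha[i] += y[i] * t
        alpha[j] -= y[j] * t
        grad += t * (y[i] * Q[:, i] - y[j] * Q[:, j])
    return alpha
```

On a small linearly separable set, the recovered weight vector `w = X.T @ (alpha * y)` and a bias computed from any free support vector (`0 < alpha_k < C`) classify the training points correctly; the equality constraint `y^T alpha = 0` is maintained by construction since each step moves the pair along a direction that leaves it unchanged.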