Published in: Knowledge and Information Systems - Special Issue on Mining Low-Quality Data
This paper deals with fast methods for training a 1-norm support vector machine (SVM). First, we define a specific class of linear programs with many sparse constraints, called row-column sparse constraint linear programming (RCSC-LP). The 1-norm SVM is by nature an instance of RCSC-LP. To construct and solve subproblems of RCSC-LP, a family of row-column generation (RCG) methods is introduced. RCG methods belong to the category of decomposition techniques and generate rows and columns in parallel. In particular, for the 1-norm SVM the maximum size of an RCG subproblem equals the number of support vectors (SVs). We also introduce a semi-deleting rule for RCG methods and prove their convergence under this rule. Experimental results on toy data and real-world datasets illustrate that RCG trains the 1-norm SVM efficiently, especially when the number of SVs is small.
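To make the starting point concrete: the abstract notes that the 1-norm SVM is itself a linear program. Below is a minimal sketch of that LP (not the paper's RCG decomposition, and not the authors' code) using the standard reformulation w = u - v with u, v >= 0 so that ||w||_1 becomes a linear objective; the solver and the toy dataset are our own illustrative choices, here SciPy's `linprog`.

```python
# Sketch: the 1-norm SVM written as a plain linear program.
# minimize ||w||_1 + C * sum(xi)  s.t.  y_i (w.x_i + b) >= 1 - xi_i, xi >= 0
# This solves the full LP directly; the paper's RCG methods instead work on
# small row/column-generated subproblems of this same program.
import numpy as np
from scipy.optimize import linprog

def fit_l1_svm(X, y, C=1.0):
    n, d = X.shape
    # Variables z = [u (d), v (d), b+, b-, xi (n)], all nonnegative; w = u - v.
    c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])
    # Margin constraints rewritten as A_ub @ z <= b_ub:
    #   -y_i * (u - v).x_i - y_i * (b+ - b-) - xi_i <= -1
    A_ub = np.hstack([
        -y[:, None] * X,           # coefficients on u
        y[:, None] * X,            # coefficients on v
        -y[:, None], y[:, None],   # coefficients on b+, b-
        -np.eye(n),                # coefficients on xi
    ])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    z = res.x
    w = z[:d] - z[d:2 * d]
    bias = z[2 * d] - z[2 * d + 1]
    return w, bias

# Tiny linearly separable toy problem (illustrative data, not from the paper).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.0, 3.0],
              [-2.0, -2.0], [-3.0, -3.0], [-2.0, -3.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
w, b = fit_l1_svm(X, y, C=1.0)
pred = np.sign(X @ w + b)
```

The 1-norm penalty drives many components of w to exactly zero, which is the sparseness the abstract refers to; RCG exploits the matching sparsity in the constraint matrix so that only rows and columns tied to support vectors ever enter a subproblem.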