Support vector machine (SVM) is a state-of-the-art classification method, and the doubly regularized SVM (DrSVM), based on the elastic-net penalty, is an important extension. DrSVM has been successfully applied to variable selection while retaining (or discarding) correlated variables together. However, solving this model is computationally challenging. In this paper we develop an iterative ℓ2-SVM approach to implement DrSVM on high-dimensional datasets. Our approach significantly reduces the computational complexity, and the corresponding algorithms have a global convergence property. Empirical results on simulated and real-world gene datasets are encouraging.
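To make the elastic-net objective behind DrSVM concrete, the following is a minimal NumPy sketch of that objective solved by a generic proximal-subgradient loop. This is a hypothetical illustration of the penalty structure only, not the paper's iterative ℓ2-SVM algorithm; the function name, step size, and penalty weights are assumptions.

```python
import numpy as np

def drsvm_prox_subgrad(X, y, lam1=0.05, lam2=0.1, lr=0.01, n_iter=500):
    """Sketch of the DrSVM (elastic-net SVM) objective:
        min_w  sum_i max(0, 1 - y_i x_i^T w) + lam1*||w||_1 + (lam2/2)*||w||_2^2
    solved by proximal subgradient descent (illustrative, not the paper's method)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # subgradient of the hinge loss (active where margin < 1) plus ridge gradient
        g = -(X * y[:, None])[margins < 1].sum(axis=0) + lam2 * w
        w = w - lr * g
        # soft-thresholding: proximal step for the l1 term, which induces sparsity
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam1, 0.0)
    return w

# Toy usage: two correlated informative features among noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=200))
y[y == 0] = 1.0
w = drsvm_prox_subgrad(X, y)
```

The ℓ1 term drives irrelevant coefficients toward zero, while the ℓ2 term keeps correlated informative features in the model together — the "retain (or discard) correlated variables" behavior the abstract refers to.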