L2- and L1-constrained regression methods, such as ridge regression and the Lasso, are well known for their fitting ability. More recently, L0-constrained classification has been used for feature selection and classifier construction. This paper proposes an L0-constrained regression method that minimizes both the epsilon-insensitive fitting errors and an L0 penalty on the regression coefficients. The resulting problem can be efficiently approximated by a successive linearization algorithm, and it exhibits the favorable properties of selecting a compact set of fitting coefficients while tolerating small fitting errors. To make the L0-constrained regression generally applicable, an extension to nonlinear regression is also addressed in this paper.
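As a rough illustration of the approach described above, the sketch below combines an epsilon-insensitive loss with a concave exponential surrogate for the L0 penalty, solved by successive linearization (each step is a linear program, in the spirit of concave-minimization feature selection). All names, parameter values, and the exact surrogate are illustrative assumptions; the paper's own formulation may differ in detail.

```python
# Hedged sketch (not the paper's exact algorithm): epsilon-insensitive
# regression with an approximate L0 penalty, lam * sum_j (1 - exp(-alpha*v_j)),
# where v_j >= |w_j|. The concave penalty is re-linearized at each iteration,
# so every step reduces to a linear program.
import numpy as np
from scipy.optimize import linprog

def l0_eps_regression(X, y, eps=0.1, lam=0.5, alpha=5.0, iters=10):
    n, d = X.shape
    # Variable layout: [w (d) | b (1) | v (d) | xi (n)]
    nvar = d + 1 + d + n
    A = np.zeros((2 * n + 2 * d, nvar))
    b_ub = np.zeros(2 * n + 2 * d)
    # Epsilon-insensitive constraints: +/-(X w + b - y_i) <= eps + xi_i
    A[:n, :d] = X
    A[:n, d] = 1.0
    A[:n, d + 1 + d:] = -np.eye(n)
    b_ub[:n] = eps + y
    A[n:2 * n, :d] = -X
    A[n:2 * n, d] = -1.0
    A[n:2 * n, d + 1 + d:] = -np.eye(n)
    b_ub[n:2 * n] = eps - y
    # Bound constraints |w_j| <= v_j, written as two linear inequalities
    A[2 * n:2 * n + d, :d] = np.eye(d)
    A[2 * n:2 * n + d, d + 1:d + 1 + d] = -np.eye(d)
    A[2 * n + d:, :d] = -np.eye(d)
    A[2 * n + d:, d + 1:d + 1 + d] = -np.eye(d)
    bounds = [(None, None)] * (d + 1) + [(0, None)] * (d + n)
    v = np.ones(d)  # current linearization point
    for _ in range(iters):
        c = np.zeros(nvar)
        # Gradient of the concave surrogate lam*(1 - exp(-alpha*v)) at v:
        # coefficients that stay small get a steep penalty and are driven to 0.
        c[d + 1:d + 1 + d] = lam * alpha * np.exp(-alpha * v)
        c[d + 1 + d:] = 1.0  # fitting slacks xi
        res = linprog(c, A_ub=A, b_ub=b_ub, bounds=bounds)
        w, b0 = res.x[:d], res.x[d]
        v = res.x[d + 1:d + 1 + d]
    return w, b0
```

On clean data generated by a single feature, the reweighting step drives the irrelevant coefficients toward zero while residuals inside the epsilon tube incur no loss, which is the compactness/tolerance trade-off the abstract describes.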