Atomic Decomposition by Basis Pursuit
SIAM Review
Feature Selection via Concave Minimization and Support Vector Machines
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
A robust nonlinear identification algorithm using PRESS statistic and forward regression
IEEE Transactions on Neural Networks
A Predual Proximal Point Algorithm Solving a Non Negative Basis Pursuit Denoising Model
International Journal of Computer Vision
Feature selection is a fundamental step in many classifier design problems. However, it is NP-complete, and approximate approaches often require extensive exploration and evaluation. This paper describes a novel approach that casts feature selection as a continuous regularization problem with a single, global minimum, in which the model's complexity is measured by the 1-norm of the parameter vector. A new exploratory design process is also described that allows the designer to efficiently construct the complete locus of sparse, kernel-based classifiers. It allows the designer to investigate the trajectories of the optimal parameters as the regularization parameter is varied and to look for effects, such as Simpson's paradox, that arise in many multivariate data analysis problems. The approach is demonstrated on the well-known Australian Credit data set.
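The core idea of the abstract — sweeping a 1-norm regularization parameter and tracing the resulting locus of sparse parameter vectors — can be sketched with a standard proximal-gradient (ISTA) solver. This is a minimal illustration on synthetic data, not the paper's algorithm or data set: the solver, the data, and the parameter grid are all assumptions chosen for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrinks entries toward zero,
    # setting small ones exactly to zero (the source of sparsity).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l1_regularized_fit(X, y, lam, n_iter=500):
    # Minimize (1/2n)||Xw - y||^2 + lam * ||w||_1 via ISTA.
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the gradient
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Synthetic stand-in data: only the first two features are informative.
rng = np.random.RandomState(0)
X = rng.randn(200, 10)
w_true = np.zeros(10)
w_true[:2] = [2.0, -1.5]
y = X @ w_true + 0.1 * rng.randn(200)

# Sweep the regularization parameter and record each optimal parameter
# vector -- the "locus" of sparse models the design process traces.
lams = np.logspace(-3, 0, 8)
path = np.array([l1_regularized_fit(X, y, lam) for lam in lams])
```

Plotting each column of `path` against `lams` gives the parameter trajectories: as regularization strengthens, coefficients are driven exactly to zero one by one, so the number of selected features falls without any combinatorial search.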