The FERET Evaluation Methodology for Face-Recognition Algorithms
IEEE Transactions on Pattern Analysis and Machine Intelligence
Discriminative Locality Alignment
ECCV '08 Proceedings of the 10th European Conference on Computer Vision: Part I
Geometric Mean for Subspace Selection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Bregman Divergence-Based Regularization for Transfer Subspace Learning
IEEE Transactions on Knowledge and Data Engineering
Manifold elastic net: a unified framework for sparse dimension reduction
Data Mining and Knowledge Discovery
In the compressed sensing and statistics communities, dozens of algorithms have been developed to solve l1-penalized least squares regression, but constrained sparse quadratic optimization (SQO) remains an open problem. In this paper, we propose backward-forward least angle shrinkage (BF-LAS), a scheme for solving general SQO, including sparse eigenvalue minimization. BF-LAS starts from the dense solution; it then iteratively shrinks the magnitudes of unimportant variables to zero in the backward step to minimize the l1 norm, decreases the gradients of important variables in the forward step to optimize the objective, and projects the solution onto the feasible set defined by the constraints. The importance of a variable is measured by its correlation with respect to the objective and is updated via least angle shrinkage (LAS). We demonstrate the promising performance of BF-LAS on sparse dimension reduction.
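The backward-forward-project iteration described in the abstract can be sketched for a simple instance of SQO, sparse minimum-eigenvector estimation (minimize x^T A x subject to a unit-norm constraint, with l1 shrinkage). The function name, step sizes, and the importance proxy below are illustrative assumptions for this sketch, not the paper's exact BF-LAS algorithm:

```python
import numpy as np

def bf_las_sketch(A, n_iter=100, step=0.01, shrink=0.01):
    """Illustrative backward-forward iteration (assumed simplification of
    BF-LAS) for: min x^T A x  subject to  ||x||_2 = 1, with l1 shrinkage."""
    # start from the dense solution: the smallest eigenvector of A
    _, vecs = np.linalg.eigh(A)
    x = vecs[:, 0]
    for _ in range(n_iter):
        grad = 2.0 * A @ x                 # gradient of the quadratic objective
        corr = np.abs(grad * x)            # assumed importance proxy per variable
        unimportant = corr < np.median(corr)
        # backward step: shrink unimportant variables' magnitudes toward zero
        x = np.sign(x) * np.maximum(np.abs(x) - shrink * unimportant, 0.0)
        # forward step: gradient descent on the objective
        x = x - step * grad
        # projection onto the feasible set (here, the unit sphere)
        nrm = np.linalg.norm(x)
        if nrm > 0:
            x = x / nrm
    return x
```

For a diagonal matrix, the iteration keeps the solution concentrated on the coordinate of the smallest eigenvalue while the shrinkage step suppresses the remaining coordinates.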