In this paper we derive an algorithm that follows the entire solution path of the sparse principal component analysis (PCA) problem. The core idea is to iteratively identify the pair of variables along which the objective function of the sparse PCA model increases most, and then incrementally update the coefficients of the two selected variables by a small stepsize. The main strength of the new algorithm is that it provides a computational shortcut to the entire spectrum of solutions of the sparse PCA problem, which is valuable in real applications. The proposed algorithm is simple and easy to implement. Its effectiveness is verified empirically on a series of synthetic and real problems, in comparison with other representative sparse PCA methods.
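The pairwise-update idea described above can be sketched roughly as follows. This is a loose illustrative sketch under stated assumptions, not the authors' actual algorithm: the function name, the initialization on the largest-variance variable, and the greedy rule of picking the two coordinates with the largest gradient magnitude of the variance objective w^T S w are all hypothetical choices made here for illustration.

```python
import numpy as np

def sparse_pca_path_sketch(S, n_steps=200, eps=0.05):
    """Hypothetical path-following sketch: at each step, pick the two
    coordinates along which the objective w^T S w increases fastest,
    nudge their coefficients by a small stepsize eps, and record the
    renormalized loading vector, yielding a path of solutions of
    gradually growing support."""
    p = S.shape[0]
    w = np.zeros(p)
    # Illustrative initialization: start on the largest-variance variable.
    w[np.argmax(np.diag(S))] = 1.0
    path = [w.copy()]
    for _ in range(n_steps):
        g = 2.0 * S @ w                    # gradient of w^T S w
        # The two coordinates with the largest gradient magnitude.
        i, j = np.argsort(np.abs(g))[-2:]
        w[i] += eps * np.sign(g[i])
        w[j] += eps * np.sign(g[j])
        w /= np.linalg.norm(w)             # keep unit length
        path.append(w.copy())
    return np.array(path)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))
X[:, 0] += 2.0 * rng.standard_normal(100)  # inflate one variable's variance
S = np.cov(X, rowvar=False)
path = sparse_pca_path_sketch(S)
print(path.shape)  # (201, 8): one loading vector per step
```

Because every intermediate loading vector is stored, the full spectrum of increasingly dense solutions is available at once, which is the practical appeal of a path-following scheme over re-solving the problem for each sparsity level.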