This paper investigates a learning formulation called structured sparsity, which is a natural extension of the standard sparsity concept in statistical learning and compressive sensing. By allowing arbitrary structures on the feature set, this concept generalizes the group sparsity idea that has become popular in recent years. A general theory is developed for learning with structured sparsity, based on the notion of coding complexity associated with the structure. It is shown that if the coding complexity of the target signal is small, then one can achieve improved performance by using coding complexity regularization methods, which generalize standard sparse regularization. Moreover, a structured greedy algorithm is proposed to efficiently solve the structured sparsity problem. It is shown that the greedy algorithm approximately solves the coding complexity optimization problem under appropriate conditions. Experiments are included to demonstrate the advantage of structured sparsity over standard sparsity in several real applications.
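To make the greedy idea concrete, here is a minimal sketch of group-wise greedy selection in the spirit the abstract describes: at each step, pick the block of features (a simple stand-in for a general coding-complexity structure) whose columns best explain the current residual, then refit by least squares on all selected blocks. This is an illustrative group orthogonal matching pursuit, not the paper's exact algorithm; the function name `group_omp` and the dictionary-of-index-arrays group encoding are assumptions made for this sketch.

```python
import numpy as np

def group_omp(X, y, groups, n_select):
    """Greedy group selection sketch.

    X: (n, p) design matrix; y: (n,) response.
    groups: dict mapping group id -> array of column indices (a simple
            stand-in for a general structured-sparsity pattern).
    n_select: number of groups to pick greedily.
    Returns the fitted coefficient vector and the selected group ids.
    """
    residual = y.copy()
    selected = []
    cols = np.array([], dtype=int)
    coef = np.array([])
    for _ in range(n_select):
        # Score each unselected group by the energy of its correlation
        # with the current residual, then take the best-scoring group.
        scores = {g: np.linalg.norm(X[:, idx].T @ residual)
                  for g, idx in groups.items() if g not in selected}
        best = max(scores, key=scores.get)
        selected.append(best)
        # Refit by least squares on the union of all selected groups.
        cols = np.concatenate([groups[g] for g in selected])
        coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        residual = y - X[:, cols] @ coef
    beta = np.zeros(X.shape[1])
    beta[cols] = coef
    return beta, selected
```

The design choice mirrors the abstract's point: by scoring whole groups rather than single coordinates, the search respects the structure on the feature set, and signals whose support matches a small number of groups are recovered with fewer greedy steps than coordinate-wise matching pursuit would need.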