The fused Lasso penalty enforces sparsity in both the coefficients and their successive differences, which is desirable for applications whose features are ordered in some meaningful way. The resulting problem is, however, challenging to solve, as the fused Lasso penalty is both non-smooth and non-separable. Existing algorithms have high computational complexity and do not scale to large problems. In this paper, we propose an Efficient Fused Lasso Algorithm (EFLA) for optimizing this class of problems. One key building block of the proposed EFLA is the Fused Lasso Signal Approximator (FLSA). To solve FLSA efficiently, we reformulate it as the problem of finding an "appropriate" subgradient of the fused penalty at the minimizer, and develop a Subgradient Finding Algorithm (SFA). We further design a restart technique to accelerate the convergence of SFA by exploiting the special "structures" of both the original and the reformulated FLSA problems. Our empirical evaluations show that both SFA and EFLA significantly outperform existing solvers. We also demonstrate several applications of the fused Lasso.
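To make the penalty concrete, the following minimal sketch (not the paper's SFA/EFLA solver) evaluates the fused Lasso penalty on a coefficient vector: an l1 term on the coefficients plus an l1 term on their successive differences. The names `fused_lasso_penalty`, `lam1`, and `lam2` are illustrative choices, not notation from the paper.

```python
def fused_lasso_penalty(beta, lam1, lam2):
    """Compute lam1 * sum_i |beta_i| + lam2 * sum_i |beta_i - beta_{i-1}|.

    The first term encourages many coefficients to be exactly zero; the
    second encourages neighboring coefficients to be equal, yielding
    piecewise-constant solutions when features have a meaningful order.
    """
    sparsity = sum(abs(b) for b in beta)
    smoothness = sum(abs(b - a) for a, b in zip(beta, beta[1:]))
    return lam1 * sparsity + lam2 * smoothness

# A mostly-zero, piecewise-constant vector incurs a small penalty:
flat = [0.0, 0.0, 2.0, 2.0, 2.0, 0.0]
print(fused_lasso_penalty(flat, 1.0, 1.0))  # 6.0 + 4.0 = 10.0
```

A vector that is both sparse and piecewise-constant, as above, pays for only its nonzero magnitudes and its two jumps, which is exactly the structure the penalty is designed to favor.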