IEEE Transactions on Image Processing
Recently, considerable research effort has been devoted to designing methods that learn overcomplete dictionaries for sparse coding from data. However, a learned dictionary requires solving a sparse approximation problem to code new data. To overcome this drawback, we propose an algorithm that learns both a dictionary and its dual: a linear mapping that performs the coding directly. Our algorithm is based on proximal methods and jointly minimizes the reconstruction error of the dictionary and the coding error of its dual; sparsity of the representation is induced by an ℓ1-based penalty on its coefficients. Experimental results show that the algorithm recovers the expected dictionaries. Furthermore, on a benchmark dataset the image features obtained from the dual matrix yield state-of-the-art classification performance while being much less computationally intensive.
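To make the idea concrete, the following is a minimal NumPy sketch of the kind of objective the abstract describes. The exact formulation is not given here, so this assumes a joint cost (1/2)||X − DA||² + (1/2)||A − WX||² + λ||A||₁ over the dictionary D, its dual W, and the codes A, optimized by alternating an ISTA-style proximal-gradient step on A (soft-thresholding is the proximal operator of the ℓ1 penalty) with closed-form least-squares updates for D and W; all names and update rules are illustrative assumptions, not the authors' method.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def learn_dictionary_and_dual(X, n_atoms, lam=0.1, n_iter=100, seed=0):
    """Illustrative sketch (assumed objective, not the paper's exact algorithm).

    Approximately minimizes
        0.5*||X - D A||_F^2 + 0.5*||A - W X||_F^2 + lam*||A||_1
    by alternating a proximal-gradient step on the codes A with
    least-squares updates of the dictionary D and its dual W.
    """
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)      # unit-norm atoms
    W = D.T.copy()                                     # dual initialized as D^T
    A = W @ X
    for _ in range(n_iter):
        # Proximal-gradient (ISTA) step on the codes A.
        grad = D.T @ (D @ A - X) + (A - W @ X)
        L = np.linalg.norm(D, 2) ** 2 + 1.0            # Lipschitz bound of grad
        A = soft_threshold(A - grad / L, lam / L)
        # Closed-form least-squares update of the dictionary ...
        D = X @ A.T @ np.linalg.pinv(A @ A.T)
        norms = np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
        D /= norms                                     # keep atoms bounded
        # ... and of the dual (linear coding map).
        W = A @ X.T @ np.linalg.pinv(X @ X.T)
    return D, W, A
```

Under this sketch, the practical advantage claimed in the abstract is visible: once W is learned, coding a new sample is a single matrix product `W @ x` rather than an iterative sparse-approximation solve.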