Soft-competitive learning of sparse codes and its application to image reconstruction

  • Authors:
  • Kai Labusch; Erhardt Barth; Thomas Martinetz

  • Affiliations:
  • University of Lübeck, Institute for Neuro- and Bioinformatics, Ratzeburger Allee 160, 23538 Lübeck, Germany (all authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2011

Abstract

We propose neural gas for dictionary learning (NGDL), a new algorithm for designing overcomplete dictionaries for sparse coding that uses a set of candidate solutions for the sparse coefficients in each dictionary update step. To obtain such a set of solutions, we additionally propose the bag of pursuits (BOP) method for sparse approximation. Using BOP to determine the coefficients, we show in an image encoding experiment that, given limited training data and limited computation time, the NGDL dictionary update outperforms the standard gradient approach used, for instance, in the Sparsenet algorithm, as well as other state-of-the-art dictionary learning methods such as the method of optimal directions (MOD) and the widely used K-SVD algorithm. In an application to image reconstruction, dictionaries trained with NGDL outperform not only overcomplete Haar wavelets and overcomplete discrete cosine transforms, but also dictionaries obtained with widely used algorithms like K-SVD.
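The methods named in the abstract (NGDL, Sparsenet's gradient approach, MOD, K-SVD) all share one alternating structure: sparse-approximate each training sample against the current dictionary, then update the dictionary atoms from the residuals. The following is a minimal NumPy sketch of that generic loop, using plain orthogonal matching pursuit for the sparse step and a simple gradient-style atom update; all function names and parameters are illustrative, and neither BOP's bookkeeping of multiple pursuit solutions nor the soft-competitive neural-gas update of NGDL is reproduced here.

```python
import numpy as np

def matching_pursuit(x, D, k):
    """Greedy sparse approximation: select k atoms of D to approximate x.
    Generic orthogonal matching pursuit; the paper's BOP method would track
    several alternative pursuit solutions instead of just one."""
    residual = x.copy()
    coeffs = np.zeros(D.shape[1])
    support = []
    for _ in range(k):
        # pick the atom most correlated with the current residual,
        # skipping atoms already selected
        scores = np.abs(D.T @ residual)
        if support:
            scores[support] = -1.0
        support.append(int(np.argmax(scores)))
        # least-squares fit on the selected atoms (the "orthogonal" step)
        a, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        coeffs[:] = 0.0
        coeffs[support] = a
        residual = x - D @ coeffs
    return coeffs

def dictionary_learning(X, n_atoms, k, n_iter=20, lr=0.1, seed=0):
    """Alternate sparse coding and a plain gradient update of the atoms
    (Sparsenet-style), renormalizing atoms to unit length after each pass."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        for x in X.T:
            a = matching_pursuit(x, D, k)
            r = x - D @ a                 # reconstruction residual
            D += lr * np.outer(r, a)      # gradient step on the dictionary
        D /= np.linalg.norm(D, axis=0)    # keep atoms unit norm
    return D
```

NGDL's contribution, per the abstract, is to replace the single-solution update in the inner loop with a soft-competitive update over a whole set of BOP solutions, which is what gives it an edge when training data and computation time are limited.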