Energy-based models for sparse overcomplete representations

  • Authors:
  • Yee Whye Teh; Max Welling; Simon Osindero; Geoffrey E. Hinton

  • Affiliations:
  • Department of Computer Science, University of Toronto, 10 King's College Road, Toronto M5S 3G4, Canada (Teh, Welling, Hinton); Gatsby Computational Neuroscience Unit, University College London, 17 Queen Square, London WC1N 3AR, United Kingdom (Osindero)

  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2003

Abstract

We present a new way of extending independent components analysis (ICA) to overcomplete representations. In contrast to the causal generative extensions of ICA which maintain marginal independence of sources, we define features as deterministic (linear) functions of the inputs. This assumption results in marginal dependencies among the features, but conditional independence of the features given the inputs. By assigning energies to the features, a probability distribution over the input states is defined through the Boltzmann distribution. Free parameters of this model are trained using the contrastive divergence objective (Hinton, 2002). When the number of features is equal to the number of input dimensions, this energy-based model reduces to noiseless ICA, and we show experimentally that the proposed learning algorithm is able to perform blind source separation on speech data. In additional experiments we train overcomplete energy-based models to extract features from various standard data-sets containing speech, natural images, hand-written digits and faces.
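The abstract's recipe can be sketched in a few lines: features are deterministic linear functions s = Wx of the input, each feature contributes an energy, and the energies define an unnormalized Boltzmann distribution over inputs that is trained with one-step contrastive divergence. The sketch below is a minimal illustration under assumptions not stated in the abstract: a Student-t style per-feature energy log(1 + s²) and a single noisy-gradient (Langevin) step to produce the negative sample; it is not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, x):
    # Features are deterministic linear functions of the input: s = W x.
    # Each feature adds an energy; log(1 + s^2) is an assumed example choice.
    s = W @ x
    return np.sum(np.log1p(s ** 2))

def grad_energy_W(W, x):
    # dE/dW for the energy above: dE/ds_i = 2 s_i / (1 + s_i^2), with s = W x.
    s = W @ x
    return np.outer(2 * s / (1 + s ** 2), x)

def cd1_update(W, x, lr=0.01, step=0.1):
    # One-step contrastive divergence: lower the energy at the data point
    # and raise it at a nearby sample reached by a single noisy
    # gradient (Langevin) step away from the data.
    s = W @ x
    grad_x = W.T @ (2 * s / (1 + s ** 2))          # dE/dx
    x_neg = (x - step * grad_x
             + np.sqrt(2 * step) * rng.normal(size=x.shape))
    return W - lr * (grad_energy_W(W, x) - grad_energy_W(W, x_neg))

# Overcomplete: 4 features on 2-dimensional inputs.
W = rng.normal(size=(4, 2))
x = rng.normal(size=2)
W = cd1_update(W, x)
print(W.shape)  # (4, 2)
```

With as many features as input dimensions, the same setup corresponds to the noiseless-ICA special case the abstract mentions; the overcomplete case (more rows in W than input dimensions) is where the conditional-independence formulation departs from causal generative ICA.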