Learning parameters of linear models in compressed parameter space

  • Authors:
  • Yohannes Kassahun; Hendrik Wöhrle; Alexander Fabisch; Marc Tabie

  • Affiliations:
  • Robotics Group, University of Bremen, Bremen, Germany (Y. Kassahun, A. Fabisch, M. Tabie); Robotics Innovation Center, DFKI GmbH, Bremen, Germany (H. Wöhrle)

  • Venue:
  • ICANN'12: Proceedings of the 22nd International Conference on Artificial Neural Networks and Machine Learning, Part II
  • Year:
  • 2012

Abstract

We present a novel method for reducing training time by learning the parameters of a model in a compressed parameter space. In the compressed parameter space, the parameters of the model are represented by a smaller number of parameters, so training can be faster. After training, the full parameters of the model can be generated from the parameters in the compressed space. We show that for supervised learning, learning the parameters of a model in compressed parameter space is equivalent to learning them in a compressed input space. We apply our method to a supervised learning domain and show that a solution can be obtained much faster than by learning in the uncompressed parameter space. For reinforcement learning, we show empirically that directly searching the parameters of a policy in compressed parameter space accelerates learning.
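
To make the idea concrete, below is a minimal sketch of learning a linear model in a compressed parameter space. The fixed cosine (DCT-II) basis `Phi`, the dimensions, and all variable names are illustrative assumptions, not the paper's exact construction: the full weight vector is generated as `theta = Phi @ alpha`, and since `y = theta^T x = alpha^T (Phi^T x)`, fitting the k entries of `alpha` against the compressed inputs `Phi^T x` is the same as learning in a compressed input space.

```python
import numpy as np

# Sketch (assumptions, not the paper's exact construction): the full
# parameter vector theta (dimension d) is generated from k << d compressed
# parameters alpha via a fixed basis Phi, i.e. theta = Phi @ alpha.
# For a linear model y = theta^T x = alpha^T (Phi^T x), so fitting alpha
# on compressed inputs Phi^T x equals learning theta in compressed space.

rng = np.random.default_rng(0)
d, k, n = 200, 20, 500                         # full dim, compressed dim, samples

X = rng.normal(size=(n, d))                    # training inputs
theta_true = np.sin(np.linspace(0.0, 3.0, d))  # smooth ground-truth weights
y = X @ theta_true + 0.01 * rng.normal(size=n)

# Fixed compression basis: the first k DCT-II basis vectors (assumption).
i, j = np.arange(d)[:, None], np.arange(k)[None, :]
Phi = np.cos(np.pi * (i + 0.5) * j / d)        # shape (d, k)

# Learn the k compressed parameters by least squares on compressed inputs.
Z = X @ Phi                                    # compressed input space, (n, k)
alpha, *_ = np.linalg.lstsq(Z, y, rcond=None)

# Generate the full d model parameters from the compressed ones.
theta_hat = Phi @ alpha

print("parameter recovery error:", np.linalg.norm(theta_hat - theta_true))
print("training MSE:", np.mean((X @ theta_hat - y) ** 2))
```

Because only k coefficients are fitted rather than d, the least-squares problem is much smaller, which is where the training-time reduction described in the abstract comes from.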