Magnification control for batch neural gas

  • Authors:
  • Barbara Hammer; Alexander Hasenfuss; Thomas Villmann

  • Affiliations:
  • Institute of Computer Science, Clausthal University of Technology, Germany; Institute of Computer Science, Clausthal University of Technology, Germany; Clinic for Psychotherapy, Universität Leipzig, Germany

  • Venue:
  • Neurocomputing
  • Year:
  • 2007

Abstract

Neural gas (NG) constitutes a very robust clustering algorithm which can be derived as stochastic gradient descent from a cost function closely connected to the quantization error. In the limit, an NG network samples the underlying data distribution. However, the connection is not linear; rather, it follows a power law with a magnification exponent different from the information-theoretically optimal one for adaptive map formation. A couple of schemes exist to explicitly control this exponent, such as local learning, which requires only a small change to the NG learning algorithm. Batch NG constitutes a fast alternative optimization scheme for NG vector quantizers; it has been derived from the same cost function and amounts to a fast Newton optimization. It possesses the same magnification factor (different from 1) as standard online NG. In this paper, we propose a method to integrate magnification control by local learning into batch NG. The key observation is a link between local learning and an underlying cost function, which opens the way towards alternative, e.g. batch, optimization schemes. We validate the learning rule derived from this altered cost function in an artificial experimental setting, and we demonstrate the benefit of magnification control for sampling rare events in a real data set.
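For context on the power law mentioned above: it is a standard result that the asymptotic prototype density of NG follows ρ(w) ∝ p(w)^α with magnification exponent α = d/(d+2), where d is the intrinsic data dimension, whereas α = 1 would be the information-theoretically optimal value. The following is a minimal sketch of batch NG extended with a density-dependent data weighting in the spirit of local-learning magnification control. It is an illustration under assumptions, not the paper's exact learning rule: the function name batch_ng_magnification, the kernel density estimate, its bandwidth choice, and the control parameter m are all illustrative; m = 0 recovers plain batch NG, while m ≠ 0 reweights data points by a power of their estimated density.

```python
import numpy as np

def batch_ng_magnification(X, K=10, epochs=50, lam0=None, lam_end=0.01,
                           m=0.0, seed=0):
    """Batch NG with an illustrative density weighting for magnification control.

    X: (n, d) data array; K: number of prototypes; lam0/lam_end: annealed
    neighborhood range; m: control parameter (m = 0 gives plain batch NG).
    The weighting dens(x)**m is a sketch of the local-learning idea, not the
    rule derived in the paper.
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    lam0 = K / 2.0 if lam0 is None else lam0
    W = X[rng.choice(n, K, replace=False)].copy()

    # Crude Gaussian kernel density estimate (assumed bandwidth: median
    # pairwise distance); any density estimator could be substituted.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    dens = np.exp(-(D / np.median(D)) ** 2).mean(axis=1)
    weights = dens ** m  # per-sample weight implementing the control

    for t in range(epochs):
        # exponential annealing of the neighborhood range lambda
        lam = lam0 * (lam_end / lam0) ** (t / max(epochs - 1, 1))
        # rank k_ij = number of prototypes closer to x_i than prototype j
        dist = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=-1)
        ranks = np.argsort(np.argsort(dist, axis=1), axis=1)
        H = np.exp(-ranks / lam) * weights[:, None]
        # batch update: each prototype moves to a weighted mean of the data
        W = (H.T @ X) / H.T.sum(axis=1, keepdims=True)
    return W
```

As a usage sketch, W = batch_ng_magnification(np.random.randn(500, 2), K=20, m=1.0) trains 20 prototypes on 2-D Gaussian data; varying m shifts how densely prototypes populate high- versus low-density regions, which is the mechanism that lets magnification control emphasize rare events as described in the abstract.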