Expansive competitive learning for kernel vector quantization

  • Authors:
  • Davide Bacciu; Antonina Starita

  • Affiliations:
  • IMT Lucca Institute for Advanced Studies, P.zza San Ponziano 6, 55100 Lucca, Italy and Dipartimento di Informatica, Università di Pisa, Largo B. Pontecorvo 3, 56127 Pisa, Italy; Dipartimento di Informatica, Università di Pisa, Largo B. Pontecorvo 3, 56127 Pisa, Italy

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2009


Abstract

In this paper we present a necessary and sufficient condition for the global optimality of unsupervised Learning Vector Quantization (LVQ) in kernel space. In particular, we generalize the results presented for expansive and competitive learning for vector quantization in Euclidean space to the general case of a kernel-based distance metric. Based on this result, we present a novel kernel LVQ algorithm with an update rule consisting of two terms: the former regulates the force of attraction between the synaptic weight vectors and the inputs; the latter regulates the repulsion between the weights and the center of gravity of the dataset. We show how this algorithm pursues global optimality of the quantization error by means of the repulsion mechanism. Simulation results show the performance of the model on common image quantization tasks; in particular, the algorithm is shown to outperform recently published quantization models such as Enhanced LBG [Patanè, G., Russo, M., 2001. The enhanced LBG algorithm. Neural Networks 14 (9), 1219-1237] and Adaptive Incremental LBG [Shen, F., Hasegawa, O., 2006. An adaptive incremental LBG for vector quantization. Neural Networks 19 (5), 694-704].
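
To make the two-term update rule concrete, here is a minimal sketch in plain Euclidean space, not the paper's kernelized formulation: the function name `expansive_cl_update`, the learning rates `eta_a` and `eta_r`, and the choice to apply both attraction and repulsion to the winning unit only are all illustrative assumptions, not the authors' exact algorithm. In the paper's setting the distance would be evaluated through a kernel rather than directly in input space.

```python
import numpy as np

def expansive_cl_update(weights, x, center, eta_a=0.05, eta_r=0.01):
    """One sketch of an expansive competitive learning step (Euclidean case).

    weights : (k, d) array of codebook (synaptic weight) vectors
    x       : (d,) input sample
    center  : (d,) center of gravity of the dataset
    eta_a, eta_r : hypothetical attraction / repulsion learning rates
    """
    # Winner-take-all: index of the codebook vector closest to the input.
    w = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Attraction term: pull the winner towards the input.
    weights[w] += eta_a * (x - weights[w])
    # Repulsion term: push the winner away from the dataset's center of
    # gravity, the mechanism the abstract credits with pursuing global
    # optimality of the quantization error.
    weights[w] -= eta_r * (center - weights[w])
    return weights

# Usage on a toy dataset with 4 codebook vectors.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 2))
center = data.mean(axis=0)
codebook = rng.normal(size=(4, 2))
for epoch in range(20):
    for x in rng.permutation(data):
        codebook = expansive_cl_update(codebook, x, center)
```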