Expansive and Competitive Learning for Vector Quantization

  • Authors:
  • J. Muñoz-Perez; J. A. Gomez-Ruiz; E. Lopez-Rubio; M. A. Garcia-Bernal

  • Affiliations:
  • Dept. of Computer Science and Artificial Intelligence, University of Malaga, E.T.S. Ingenieria Informatica, Campus de Teatinos s/n, 29071 Malaga, Spain (all authors)

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2002

Abstract

In this paper, we develop a necessary and sufficient condition for a local minimum of the vector quantization problem to be a global minimum, and we present a competitive learning algorithm based on this condition. The algorithm has two learning terms: the first regulates the attraction between the synaptic weight vectors and the input patterns in order to reach a local minimum, while the second regulates the repulsion between the synaptic weight vectors and the gravity center of the inputs to favor convergence to the global minimum. This algorithm leads to optimal or near-optimal solutions and allows the network to escape from local minima during training. Experimental results in image compression demonstrate that it outperforms the simple competitive learning algorithm, giving better codebooks.
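
To make the two-term update concrete, the sketch below implements a generic competitive learning loop in which the winning weight vector is attracted toward the current input pattern and simultaneously repelled from the gravity center (sample mean) of the data. The function name, the learning rates eta and gamma, and the exact form of the repulsion term are assumptions chosen for illustration; the published algorithm's precise update rule, parameter schedule, and optimality condition are not reproduced here.

```python
import numpy as np

def expansive_competitive_learning(X, n_units, eta=0.1, gamma=0.01,
                                   epochs=50, seed=0):
    """Sketch of competitive learning with an added repulsion term
    that pushes the winning weight vector away from the data's
    gravity center. Illustrative only; not the paper's exact rule."""
    rng = np.random.default_rng(seed)
    # Initialize the codebook with randomly chosen input patterns.
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    c = X.mean(axis=0)  # gravity center of the input patterns
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            # Winner-take-all: nearest weight vector to the input.
            j = np.argmin(np.linalg.norm(W - x, axis=1))
            # Attraction toward the input pattern (standard term)
            # plus repulsion away from the gravity center (expansive term).
            W[j] += eta * (x - W[j]) + gamma * (W[j] - c)
    return W
```

As a usage example, calling expansive_competitive_learning(X, n_units=16) on an array X of image-block vectors would return a 16-vector codebook; the repulsion term is intended to spread the codebook vectors outward from the data centroid so the network is less likely to settle in a poor local minimum.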