Expansive and Competitive Neural Networks

  • Authors:
  • José Antonio Gómez-Ruiz, José Muñoz-Perez, Ezequiel López-Rubio, M. A. Garcia-Bernal

  • Venue:
  • IWANN '01 Proceedings of the 6th International Work-Conference on Artificial and Natural Neural Networks: Connectionist Models of Neurons, Learning Processes and Artificial Intelligence-Part I
  • Year:
  • 2001

Abstract

In this paper, we develop a necessary and sufficient condition for a local minimum of the vector quantization problem to be a global minimum, and we present a competitive learning algorithm based on this condition. The algorithm has two learning terms: the first regulates the attraction between the synaptic weight vectors and the input patterns in order to reach a local minimum, while the second regulates the repulsion between the synaptic weight vectors and the gravity center of the inputs, favoring convergence to the global minimum. This algorithm leads to optimal or near-optimal solutions and allows the network to escape from local minima during training. Experimental results in image compression show that it outperforms the simple competitive learning algorithm, producing better codebooks.
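
The two-term update described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: the learning rates alpha (attraction) and beta (repulsion), the random codebook initialization, and the per-sample winner-take-all update are choices introduced here for clarity.

```python
import numpy as np

def expansive_competitive_learning(X, n_units, epochs=50,
                                   alpha=0.1, beta=0.01, seed=0):
    """Sketch of competitive learning with an added repulsion term.

    The winning unit is attracted toward the input pattern (standard
    competitive learning) and simultaneously pushed away from the
    gravity center of the inputs (the expansive term from the abstract).
    Parameter names and schedules are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    # Initialize the codebook from randomly chosen input patterns.
    W = X[rng.choice(len(X), size=n_units, replace=False)].copy()
    g = X.mean(axis=0)  # gravity center of the input patterns

    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            # Winner: the unit whose weight vector is closest to the input.
            j = np.argmin(np.linalg.norm(W - x, axis=1))
            # Attraction toward the input pattern (first learning term).
            W[j] += alpha * (x - W[j])
            # Repulsion away from the gravity center (second learning term).
            W[j] += beta * (W[j] - g)
    return W
```

For instance, on clustered 2D data the repulsion term pushes codebook vectors outward from the data centroid, which can help under-used units leave crowded regions and so avoid the poor local minima that plain competitive learning gets trapped in.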