Dynamic competitive learning

  • Authors:
  • Seongwon Cho; Jaemin Kim; Sun-Tae Chung

  • Affiliations:
  • Seongwon Cho, Jaemin Kim: School of Electronic and Electrical Engineering, Hongik University, Seoul, Korea; Sun-Tae Chung: School of Electronic Engineering, Soongsil University, Seoul, Korea

  • Venue:
  • ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
  • Year:
  • 2006

Abstract

In this paper, a new competitive learning algorithm called Dynamic Competitive Learning (DCL) is presented. DCL is a supervised learning method that dynamically generates output neurons and automatically initializes their weight vectors from the training patterns. It introduces a new parameter called LOG (Limit of Grade) that decides whether an output neuron is created. If the class of at least one of the LOG nearest output neurons matches the class of the current training pattern, DCL adjusts the weight vector of that neuron to learn the pattern. If the classes of all LOG nearest output neurons differ from the class of the training pattern, a new output neuron is created and its weight vector is initialized from the given training pattern. The proposed method differs significantly from previous competitive learning algorithms in that the neuron selected for learning is not restricted to the winner, and output neurons are generated dynamically during the learning process. In addition, the proposed algorithm has a small number of parameters, which are easy to determine and to apply to real-world problems. Experimental results demonstrate the superiority of DCL over conventional competitive learning methods.
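The training loop described in the abstract can be sketched as follows. This is a minimal Python illustration under assumed details: the paper does not specify here the distance measure or the exact weight-update rule, so Euclidean distance and a standard learning-rate update toward the pattern are used as placeholders, and the function names (`dcl_train`, `dcl_classify`) and parameters (`lr`, `epochs`) are hypothetical.

```python
import math

def dcl_train(patterns, labels, log=3, lr=0.1, epochs=5):
    """Sketch of Dynamic Competitive Learning (DCL).

    neurons grows dynamically; each entry is [weight_vector, class_label].
    log is the LOG (Limit of Grade) parameter: how many nearest output
    neurons are inspected before deciding to update or to create a neuron.
    """
    neurons = []
    for _ in range(epochs):
        for x, y in zip(patterns, labels):
            # rank existing output neurons by distance to the pattern
            # (Euclidean distance assumed here)
            ranked = sorted(neurons, key=lambda n: math.dist(n[0], x))
            candidates = ranked[:log]
            # nearest neuron among the LOG candidates with a matching class
            match = next((n for n in candidates if n[1] == y), None)
            if match is not None:
                # a class match exists: adjust that neuron's weight vector
                # toward the pattern (simple learning-rate update assumed)
                match[0] = [w + lr * (xi - w) for w, xi in zip(match[0], x)]
            else:
                # no class match among the LOG nearest neurons: create a new
                # output neuron, initialized from the training pattern itself
                neurons.append([list(x), y])
    return neurons

def dcl_classify(neurons, x):
    # classify by the class of the nearest output neuron
    return min(neurons, key=lambda n: math.dist(n[0], x))[1]
```

On an empty network the first pattern of each class necessarily spawns a neuron, so the network's size adapts to the data rather than being fixed in advance, which is the key difference from classical winner-take-all competitive learning.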