An incremental network for on-line unsupervised classification and topology learning

  • Authors:
  • Shen Furao; Osamu Hasegawa

  • Affiliations:
  • Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, R2-52, 4259 Nagatsuta, Midori-ku, Yokohama 226-8503, Japan; Imaging Science and Engineering Lab., Tokyo Institute of Technology and PRESTO, Japan Science and Technology Agency (JST), Japan

  • Venue:
  • Neural Networks
  • Year:
  • 2006

Abstract

This paper presents an on-line unsupervised learning mechanism for unlabeled data that are polluted by noise. Using a similarity-threshold-based and a local-error-based insertion criterion, the system is able to grow incrementally and to accommodate input patterns from an on-line, non-stationary data distribution. The definition of a utility parameter, the error-radius, allows the system to learn the number of nodes needed to solve a task. A new technique for removing nodes in low-probability-density regions can separate clusters with low-density overlaps and dynamically eliminate noise from the input data. The two-layer neural network design enables the system to represent the topological structure of unsupervised on-line data, report a reasonable number of clusters, and give typical prototype patterns of every cluster without prior conditions such as a suitable number of nodes or a good initial codebook.
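
The similarity-threshold-based insertion criterion mentioned in the abstract can be illustrated with a minimal sketch. The sketch below assumes a common formulation (the threshold of a node is its maximum distance to its topological neighbours, or its distance to the nearest other node when it has no neighbours); the exact rule, parameter names, and the handling of the local-error-based criterion in the paper may differ.

```python
import numpy as np

def similarity_threshold_insert(x, nodes, edges):
    """Minimal sketch of a similarity-threshold-based insertion rule.

    x      : input vector of shape (d,)
    nodes  : (N, d) array of prototype vectors, assumed N >= 2
    edges  : set of (i, j) index pairs defining the current topology

    The threshold definition used here is an illustrative assumption,
    not necessarily the exact rule of the paper.
    """
    dists = np.linalg.norm(nodes - x, axis=1)
    winner, second = np.argsort(dists)[:2]

    def threshold(i):
        # Topological neighbours of node i, read off the edge set.
        neighbours = [j for (a, b) in edges for j in (a, b)
                      if i in (a, b) and j != i]
        if neighbours:
            return max(np.linalg.norm(nodes[i] - nodes[j]) for j in neighbours)
        # Isolated node: fall back to the distance to the nearest other node.
        others = np.delete(np.arange(len(nodes)), i)
        return min(np.linalg.norm(nodes[i] - nodes[j]) for j in others)

    # If the input lies outside the similarity thresholds of both the
    # winner and the second winner, treat it as a new node.
    if dists[winner] > threshold(winner) or dists[second] > threshold(second):
        return np.vstack([nodes, x]), True   # insert x as a new prototype
    return nodes, False                      # otherwise adapt existing nodes
```

In this formulation, the threshold adapts to the local scale of the data: densely connected regions reject distant inputs, while sparse regions remain open to growth, which is what lets the network expand incrementally on non-stationary input.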