Dynamic cell structure learns perfectly topology preserving map

  • Authors:
  • Jörg Bruske; Gerald Sommer

  • Venue:
  • Neural Computation
  • Year:
  • 1995


Abstract

Dynamic cell structures (DCS) represent a family of artificial neural architectures suited both for unsupervised and supervised learning. They belong to the recently introduced (Martinetz 1994) class of topology representing networks (TRN) that build perfectly topology preserving feature maps. DCS employ a modified Kohonen learning rule in conjunction with competitive Hebbian learning. The Kohonen-type learning rule serves to adjust the synaptic weight vectors, while Hebbian learning establishes a dynamic lateral connection structure between the units reflecting the topology of the feature manifold. In the case of supervised learning, i.e., function approximation, each neural unit implements a radial basis function, and an additional layer of linear output units adjusts according to a delta rule. DCS is the first RBF-based approximation scheme attempting to concurrently learn and utilize a perfectly topology preserving map for improved performance. Simulations on a selection of CMU benchmarks indicate that the DCS idea applied to the growing cell structure algorithm (Fritzke 1993c) leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.
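The unsupervised core described in the abstract, a Kohonen-type weight adaptation combined with competitive Hebbian learning of the lateral connection graph, can be sketched as follows. This is a minimal illustration with assumed function and parameter names (`dcs_step`, `eps_b`, `eps_n`); the full DCS algorithm additionally grows and prunes units along the lines of the growing cell structures, which is omitted here:

```python
import numpy as np

def dcs_step(weights, edges, x, eps_b=0.1, eps_n=0.01):
    """One unsupervised DCS-style update on input vector x (sketch).

    weights : (n_units, dim) array of synaptic weight vectors
    edges   : set of frozenset({i, j}) lateral connections
    """
    # Competition: find the best and second-best matching units.
    d = np.linalg.norm(weights - x, axis=1)
    b, s = (int(i) for i in np.argsort(d)[:2])
    # Competitive Hebbian learning: connect the two winners.
    # In the limit this edge set reflects the topology of the
    # feature manifold (a topology representing network).
    edges.add(frozenset((b, s)))
    # Kohonen-type rule: move the winner toward the input,
    # and its lateral neighbors with a smaller learning rate.
    weights[b] += eps_b * (x - weights[b])
    for e in edges:
        if b in e:
            (n,) = e - {b}
            weights[n] += eps_n * (x - weights[n])
    return b, s

# Toy run: 5 units adapting to random 2-D inputs.
rng = np.random.default_rng(0)
W = rng.random((5, 2))
E = set()
for _ in range(100):
    dcs_step(W, E, rng.random(2))
```

In the supervised variant, each unit would additionally carry an RBF activation, with a linear output layer trained by a delta rule on top of this graph.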