Reformulating Learning Vector Quantization and Radial Basis Neural Networks

  • Author:
  • Nicolaos B. Karayiannis

  • Affiliation:
  • (Corresponding author) Department of Electrical and Computer Engineering, University of Houston, Houston, Texas 77204-4793, USA. Karayiannis@UH.EDU

  • Venue:
  • Fundamenta Informaticae
  • Year:
  • 1999

Abstract

This paper proposes a framework for developing a broad variety of soft clustering and learning vector quantization (LVQ) algorithms based on gradient descent minimization of a reformulation function. According to the proposed axiomatic approach to learning vector quantization, the development of specific algorithms reduces to the selection of a generator function. A linear generator function leads to the fuzzy c-means (FCM) and fuzzy LVQ (FLVQ) algorithms, while an exponential generator function leads to entropy-constrained fuzzy clustering (ECFC) and entropy-constrained LVQ (ECLVQ) algorithms. The reformulation of clustering and LVQ algorithms is also extended to supervised learning models through an axiomatic approach proposed for reformulating radial basis function (RBF) neural networks. This approach yields a broad variety of admissible RBF models, with the form of the radial basis functions determined by a generator function. The paper shows that gradient descent learning makes reformulated RBF neural networks an attractive alternative to conventional feed-forward neural networks.
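The central device in this framework is the reformulation function, which absorbs the membership computation into an objective that depends only on the prototypes, so the whole procedure can be run as plain gradient descent. Below is a minimal NumPy sketch of one descent step on the standard FCM reformulation function, the case the abstract associates with a linear generator function. It is an illustration under stated assumptions, not the paper's implementation: the function name fcm_reformulation_step, the learning rate eta, the fuzziness exponent m, and the small eps regularizer are all illustrative choices rather than notation from the paper.

```python
import numpy as np

def fcm_reformulation_step(X, V, m=2.0, eta=0.01, eps=1e-12):
    """One gradient-descent step on the FCM reformulation function
    R(V) = sum_i ( sum_j ||x_i - v_j||^(2/(1-m)) )^(1-m), with m > 1."""
    # Squared Euclidean distances d_ij = ||x_i - v_j||^2, shape (N, c);
    # eps guards against a zero distance when a point sits on a prototype.
    D = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=-1) + eps
    # Memberships fall out of the reformulation:
    # u_ij = d_ij^(1/(1-m)) / sum_k d_ik^(1/(1-m)).
    W = D ** (1.0 / (1.0 - m))
    U = W / W.sum(axis=1, keepdims=True)
    # The gradient of R w.r.t. prototype v_j is -2 * sum_i u_ij^m (x_i - v_j),
    # so each descent step nudges v_j toward a membership-weighted mean.
    for j in range(V.shape[0]):
        grad = -2.0 * ((U[:, j] ** m)[:, None] * (X - V[j])).sum(axis=0)
        V[j] = V[j] - eta * grad
    return V, U

# Toy usage: three prototypes on synthetic 2-D data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
V = X[rng.choice(len(X), size=3, replace=False)].copy()
for _ in range(100):
    V, U = fcm_reformulation_step(X, V)
```

Per the abstract, swapping the generator function (e.g., linear for exponential) changes only how the memberships and objective are formed, yielding the entropy-constrained ECFC/ECLVQ variants; the gradient-descent loop itself would be unchanged.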