Border-sensitive learning in kernelized learning vector quantization

  • Authors:
  • Marika Kästner (Computational Intelligence Group, University of Applied Sciences Mittweida, Mittweida, Germany)
  • Martin Riedel (Computational Intelligence Group, University of Applied Sciences Mittweida, Mittweida, Germany)
  • Marc Strickert (Computational Intelligence Group, Philipps-University Marburg, Marburg, Germany)
  • Wieland Hermann (Department of Neurology, Paracelsus Hospital Zwickau, Zwickau, Germany)
  • Thomas Villmann (Computational Intelligence Group, University of Applied Sciences Mittweida, Mittweida, Germany)

  • Venue:
  • IWANN'13: Proceedings of the 12th International Work-Conference on Artificial Neural Networks (Advances in Computational Intelligence), Part I
  • Year:
  • 2013

Abstract

Prototype-based classification approaches are powerful tools for class discrimination of vectorial data. Well-known examples are learning vector quantization (LVQ) models and support vector machines (SVMs). In this paper we propose the use of kernel distances in LVQ, so that the LVQ algorithm operates in a data space that is topologically equivalent to the feature mapping space of SVMs. Further, we provide strategies that force the LVQ prototypes to become class-border sensitive. In this way, an alternative to SVMs based on Hebbian learning is established. After presenting the theoretical background, we demonstrate the abilities of the model on an illustrative toy example and on the more challenging task of classifying Wilson's disease patients according to their neurophysiological impairments.
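
The abstract only outlines the idea, so the sketch below illustrates what a kernelized GLVQ-style update with a simple border-sensitivity rule could look like. It assumes a Gaussian kernel (so the squared kernel distance reduces to 2 - 2k(x,w)), prototypes kept in the input space, and a window criterion on the relative distance difference mu(x) as the border-sensitivity mechanism. These choices, as well as all names and parameters (train_kglvq, border_window, etc.), are illustrative assumptions and not taken from the paper itself.

```python
# Minimal sketch: GLVQ-style prototype updates with a kernel-induced distance
# and a window-based border-sensitivity rule (all design choices are assumptions).
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def kernel_dist2(x, w, sigma=1.0):
    # squared kernel distance k(x,x) - 2 k(x,w) + k(w,w); equals 2 - 2 k(x,w) for a Gaussian kernel
    return 2.0 - 2.0 * gaussian_kernel(x, w, sigma)

def train_kglvq(X, y, prototypes, proto_labels, sigma=1.0, lr=0.05,
                epochs=30, border_window=0.3):
    """GLVQ-style learning with kernel distances.

    Only samples whose relative distance difference mu(x) lies inside
    [-border_window, border_window], i.e. close to the decision border,
    trigger a prototype update -- a simple stand-in for border sensitivity.
    """
    W = prototypes.copy()
    proto_labels = np.asarray(proto_labels)
    for _ in range(epochs):
        for x, c in zip(X, y):
            d2 = np.array([kernel_dist2(x, w, sigma) for w in W])
            same = proto_labels == c
            jp = np.argmin(np.where(same, d2, np.inf))   # closest correct prototype
            jm = np.argmin(np.where(~same, d2, np.inf))  # closest wrong prototype
            dp, dm = d2[jp], d2[jm]
            mu = (dp - dm) / (dp + dm + 1e-12)
            if abs(mu) > border_window:                  # sample far from the border: skip
                continue
            xi_p = 2.0 * dm / (dp + dm + 1e-12) ** 2
            xi_m = 2.0 * dp / (dp + dm + 1e-12) ** 2
            # gradient of the squared kernel distance w.r.t. the prototype
            grad_p = -2.0 * gaussian_kernel(x, W[jp], sigma) * (x - W[jp]) / sigma ** 2
            grad_m = -2.0 * gaussian_kernel(x, W[jm], sigma) * (x - W[jm]) / sigma ** 2
            W[jp] -= lr * xi_p * grad_p   # attract the closest correct prototype
            W[jm] += lr * xi_m * grad_m   # repel the closest wrong prototype
    return W
```

Skipping samples with |mu(x)| above the window keeps updates Hebbian-like while concentrating the prototypes near the class borders; the paper's actual border-sensitivity strategies may differ from this simplified rule.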