Relevance learning in learning vector quantization is a central paradigm for task-dependent feature weighting and selection in classification. We propose a functional approach to relevance learning for high-dimensional functional data. For this purpose, we compose the relevance profile as a superposition of only a few parametrized basis functions, taking the functional character of the data into account. The number of these parameters is usually much smaller than the number of relevance weights in standard relevance learning, which equals the number of data dimensions. Thus, instabilities in learning are avoided and an inherent regularization takes place. In addition, we discuss strategies for obtaining sparse relevance models for further model optimization.
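The idea of a functional relevance profile can be illustrated with a minimal sketch. Here we assume Gaussian basis functions as the parametrized building blocks (the paper's actual basis choice may differ); the function names `relevance_profile` and `weighted_sq_distance` are hypothetical. With K basis functions, only 3K parameters (centers, widths, amplitudes) are learned instead of d per-dimension relevance weights:

```python
import numpy as np

def relevance_profile(centers, widths, amplitudes, d):
    """Compose a relevance profile over d input dimensions as a
    superposition of K parametrized Gaussian basis functions.
    Hypothetical sketch: basis type and normalization are assumptions."""
    t = np.linspace(0.0, 1.0, d)  # functional "time" axis of the data
    # basis[k, i] = exp(-0.5 * ((t_i - c_k) / s_k)^2), shape (K, d)
    basis = np.exp(-0.5 * ((t[None, :] - centers[:, None]) / widths[:, None]) ** 2)
    lam = amplitudes @ basis          # superposition, shape (d,)
    lam = np.clip(lam, 0.0, None)     # relevances must be nonnegative
    return lam / lam.sum()            # normalize as in standard relevance LVQ

def weighted_sq_distance(x, w, lam):
    """GRLVQ-style relevance-weighted squared Euclidean distance
    between a data vector x and a prototype w."""
    return float(np.sum(lam * (x - w) ** 2))

# Example: d = 100 input dimensions, but only K = 3 basis functions,
# i.e. 9 adaptive relevance parameters instead of 100 relevance weights.
lam = relevance_profile(
    centers=np.array([0.2, 0.5, 0.8]),
    widths=np.array([0.1, 0.1, 0.1]),
    amplitudes=np.array([1.0, 0.5, 1.0]),
    d=100,
)
```

During training, gradients of the cost function would be taken with respect to the few basis parameters rather than the full relevance vector, which is the source of the implicit regularization described above.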