Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation.
Expansive and Competitive Learning for Vector Quantization. Neural Processing Letters.
Competitive repetition-suppression (CoRe) learning. ICANN'06: Proceedings of the 16th International Conference on Artificial Neural Networks, Part I.
A cost-function approach to rival penalized competitive learning (RPCL). IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Input space versus feature space in kernel-based methods. IEEE Transactions on Neural Networks.
Competitive Repetition-suppression (CoRe) clustering is a bio-inspired learning algorithm capable of automatically determining the unknown number of clusters from the data. Previous work has shown that CoRe clustering is a robust generalization of rival penalized competitive learning (RPCL) obtained by means of M-estimators. This paper studies the convergence behavior of the CoRe model, building on the analysis proposed for the distance-sensitive RPCL (DSRPCL) algorithm. Furthermore, a global minimum criterion for learning vector quantization in kernel space is proposed and used to assess the correct location property of the CoRe algorithm.
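To make the RPCL mechanism that CoRe generalizes concrete, here is a minimal sketch of one RPCL-style update step: the winning prototype is pulled toward the sample while the second-closest prototype (the rival) is penalized by being pushed away. The function name `rpcl_step` and the learning rates are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def rpcl_step(prototypes, x, lr_win=0.05, lr_rival=0.005):
    """One rival penalized competitive learning (RPCL) update.

    prototypes: (k, d) array of cluster prototypes, modified in place.
    x: (d,) input sample.
    lr_win, lr_rival: illustrative learning rates; the rival rate is
    kept much smaller than the winner rate, as is customary in RPCL.
    """
    # Rank prototypes by Euclidean distance to the sample.
    distances = np.linalg.norm(prototypes - x, axis=1)
    order = np.argsort(distances)
    winner, rival = order[0], order[1]
    # Pull the winner toward the sample.
    prototypes[winner] += lr_win * (x - prototypes[winner])
    # Push the rival away from the sample (the penalization step).
    prototypes[rival] -= lr_rival * (x - prototypes[rival])
    return prototypes
```

Repeated over the data, the penalization drives superfluous prototypes away from dense regions, which is how RPCL (and, in robustified form, CoRe) estimates the cluster number automatically.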