Representation-burden Conservation Network Applied to Learning VQ (NPL270)
Neural Processing Letters
Improved Representation-burden Conservation Network for Learning Non-stationary VQ
Neural Processing Letters
A Novel Self-Creating Neural Network for Learning Vector Quantization
Neural Processing Letters
Efficient Vector Quantization Using the WTA-Rule with Activity Equalization
Neural Processing Letters
Hill-Climbing, Density-Based Clustering and Equiprobabilistic Topographic Maps
Journal of VLSI Signal Processing Systems
A Multi-purpose Visual Classification System
Proceedings of the International Conference, 7th Fuzzy Days on Computational Intelligence, Theory and Applications
Magnification Control in Self-Organizing Maps and Neural Gas
Neural Computation
We study the codeword distribution for a conscience-type competitive learning algorithm, frequency-sensitive competitive learning (FSCL), using one-dimensional input data. We prove that the asymptotic codeword density in the limit of a large number of codewords is given by a power law of the form Q(x) = C·P(x)^α, where P(x) is the input data density and α depends on the algorithm and on the form of the distortion measure to be minimized. We further show that the algorithm can be adjusted to minimize any Lp distortion measure with p ranging in (0, 2].
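The FSCL "conscience" mechanism described in the abstract can be sketched in a few lines: each codeword's distortion to the input is weighted by its win count, so codewords that win often become less attractive and the codebook spreads out over the data density. The sketch below is a minimal illustration under common assumptions about FSCL (count-weighted squared distortion, online winner update); the function and parameter names are hypothetical, not from the paper.

```python
import numpy as np

def fscl(data, n_codewords=8, lr=0.05, n_epochs=20, seed=0):
    """Minimal frequency-sensitive competitive learning (FSCL) sketch
    for one-dimensional input data. The winner is chosen by win-count-
    weighted squared distortion (the "conscience" term), then moved
    toward the input sample."""
    rng = np.random.default_rng(seed)
    codewords = rng.choice(data, size=n_codewords, replace=False).astype(float)
    counts = np.ones(n_codewords)  # win counts; equalize codeword usage
    for _ in range(n_epochs):
        for x in rng.permutation(data):
            # frequency-sensitive winner selection
            scores = counts * (codewords - x) ** 2
            w = int(np.argmin(scores))
            codewords[w] += lr * (x - codewords[w])  # move winner toward x
            counts[w] += 1
    return np.sort(codewords)

# Usage: quantize samples drawn from a Gaussian input density P(x)
rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, 2000)
codebook = fscl(data, n_codewords=8)
```

With the count weighting, denser regions of P(x) attract more codewords; the paper's result characterizes the resulting asymptotic codeword density as the power law Q(x) = C·P(x)^α.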