Self-organization and associative memory: 3rd edition
Competitive learning algorithms for vector quantization
Neural Networks
Harmonic competition: a self-organizing multiple criteria optimization
IEEE Transactions on Neural Networks
Codeword distribution for frequency sensitive competitive learning with one-dimensional input data
IEEE Transactions on Neural Networks
Improved Representation-burden Conservation Network for Learning Non-stationary VQ
Neural Processing Letters
Efficient Vector Quantization Using the WTA-Rule with Activity Equalization
Neural Processing Letters
Combining spatial and colour information for content based image retrieval
Computer Vision and Image Understanding - Special issue on color for image indexing and retrieval
A self-creating network for learning vector quantization, called the Representation-burden Conservation Network (RCN), is developed. Each neuron in RCN is characterized by a measure of representation-burden. Conservation is achieved by constraining the summed representation-burden of all neurons to the constant 1 as the representation-burden values are updated after each input presentation. We show that RCN effectively fulfills the conscience principle [1] and achieves a biologically plausible self-development capability. In addition, conservation of representation-burden facilitates systematic derivation of the learning parameters, including an adaptive learning-rate control that accelerates convergence and improves node utilization. Because its updates are smooth and incremental, RCN overcomes the stability-plasticity dilemma. Simulation results show that RCN outperforms other competitive learning networks in minimizing the quantization error.
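The abstract does not give the exact update rules, but the mechanism it describes (a burden-weighted winner selection in the spirit of the conscience principle, a burden-modulated learning rate, and renormalization so the summed burden stays at 1) can be sketched as follows. The specific forms of the winner rule, the rate control, and the burden increment below are assumptions for illustration, not the paper's equations:

```python
import numpy as np

def rcn_step(weights, burdens, x, base_lr=0.1):
    """One hypothetical conservation-based competitive learning step.

    weights : (k, d) array of codebook vectors
    burdens : (k,) representation-burden values, summing to 1
    x       : (d,) input vector
    """
    # Conscience-style winner selection: distances are scaled by each
    # neuron's burden so heavily used neurons are penalized (assumed form).
    dists = np.linalg.norm(weights - x, axis=1)
    winner = int(np.argmin(burdens * dists))

    # Move the winner toward the input; the learning rate is modulated by
    # the winner's burden (an assumed version of the adaptive rate control).
    lr = base_lr * (1.0 - burdens[winner])
    weights[winner] += lr * (x - weights[winner])

    # Increase the winner's burden, then renormalize so that the summed
    # representation-burden of all neurons is conserved at 1.
    burdens[winner] += base_lr
    burdens /= burdens.sum()
    return winner
```

Because every step renormalizes the burden vector, the conservation constraint holds after each presentation, which is the property the paper uses to derive its learning parameters.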