The capacity of the Hopfield associative memory
IEEE Transactions on Information Theory
A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Self-organization using Potts models
Neural Networks
The magnitude of the diagonal elements in neural networks
Neural Networks
Loading temporal associative memory using the neuronic equation
ICANN/ICONIP'03 Proceedings of the 2003 joint international conference on Artificial neural networks and neural information processing
A novel continuous-time neural network for realizing associative memory
IEEE Transactions on Neural Networks
Modeling word perception using the Elman network
Neurocomputing
Resolving Hidden Representations
Neural Information Processing
Neural Network Method for Protein Structure Search Using Cell-Cell Adhesion
Neural Information Processing
Geometrical Perspective on Hairy Memory
ICANN '08 Proceedings of the 18th international conference on Artificial Neural Networks, Part II
Backbone structure of hairy memory
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part I
ICONIP'06 Proceedings of the 13th international conference on Neural Information Processing - Volume Part I
This paper presents a method for expanding the basins of attraction of stable patterns in associative memory. It examines the fully connected associative memory geometrically and translates the learning process into an algebraic optimization procedure. It finds that locating all patterns at stable corners of the neurons' hypercube, as far from the decision hyperplanes as possible, yields excellent error tolerance, and it devises a method based on this finding to construct those hyperplanes. The paper further shows that this method leads to the hairy model, the deterministic analogue of the Gibbs free-energy model. Simulations show that the method tolerates errors better than both the Hopfield model and the error-correction rule, in both synchronous and asynchronous update modes.
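The core idea, pushing each stored pattern away from every neuron's decision hyperplane, can be illustrated with a perceptron-style training loop. This is only a minimal sketch of the general margin-maximization principle, not the paper's actual optimization procedure; the function names (`train_margin`, `recall`) and the fixed per-neuron margin criterion are assumptions introduced here for illustration.

```python
import numpy as np

def train_margin(patterns, margin=1.0, lr=0.1, epochs=100):
    """Train a weight matrix so every stored bipolar pattern sits at a
    stable hypercube corner with at least `margin` clearance from each
    neuron's decision hyperplane. Hypothetical sketch, not the paper's
    exact algorithm."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        updated = False
        for x in patterns:
            h = W @ x
            # A neuron i is "too close" to its hyperplane when x_i * h_i < margin.
            viol = x * h < margin
            if viol.any():
                updated = True
                # Hebbian-style correction only for the violating rows,
                # which pushes the pattern further into its stable corner.
                W[viol] += lr * np.outer(x[viol], x)
        np.fill_diagonal(W, 0.0)  # keep self-connections zero, as in Hopfield nets
        if not updated:
            break  # all patterns satisfy the margin: converged
    return W

def recall(W, x, steps=20):
    """Synchronous recall: iterate sign(Wx) until a fixed point."""
    x = x.copy()
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties deterministically
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x
```

At convergence every stored pattern satisfies the margin with the diagonal already zeroed, so each is a fixed point of `recall`, and the enforced clearance from the hyperplanes is what enlarges the basins of attraction relative to the plain Hebbian (Hopfield) rule.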