We investigated the properties of mixed states in a sparsely encoded associative memory model trained with a structural learning method. When a mixed state is composed of s memory patterns, s types of mixed states can be generated, each of which becomes an equilibrium state of the model. We analyzed these s types of mixed states using the statistical-mechanical method and found that both the storage capacity of the memory patterns and the storage capacity of one particular type of mixed state diverge in the sparse limit. We also found that the threshold needed to recall a memory pattern is nearly equal to the threshold needed to recall that particular mixed state, which means that in the sparse limit the memory patterns and this mixed state can easily be made to coexist. The properties of the model obtained by this analysis are also useful for constructing a transform-invariant recognition model.
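As a rough illustration of how a mixed state can act as an equilibrium state, the following is a minimal NumPy sketch. It assumes a covariance (Hebbian-type) learning rule over sparse binary patterns and an OR-type mixed state checked against one step of threshold dynamics; the paper's structural learning method and statistical-mechanical analysis are not reproduced, and the parameters `N`, `s`, `f`, and `theta` are illustrative choices, not values from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # number of neurons (illustrative)
s = 3     # number of memory patterns combined into the mixed state
f = 0.1   # firing rate (sparseness) of each memory pattern

# Sparse binary memory patterns: each neuron fires with probability f.
patterns = (rng.random((s, N)) < f).astype(float)

# Covariance learning rule often used for sparsely coded models:
# W_ij = (1 / (N f (1 - f))) * sum_mu (x_i^mu - f)(x_j^mu - f), W_ii = 0.
W = (patterns - f).T @ (patterns - f) / (N * f * (1 - f))
np.fill_diagonal(W, 0.0)

# One OR-type mixed state: a neuron is active if it is active in at
# least one of the s underlying memory patterns.
mixed = (patterns.sum(axis=0) >= 1).astype(float)

def update(state, theta):
    """One synchronous step of binary threshold dynamics."""
    return (W @ state - theta > 0).astype(float)

# With a suitably chosen threshold, the mixed state is close to a
# fixed point: one update step leaves almost all neurons unchanged.
theta = 0.2
overlap = np.mean(update(mixed, theta) == mixed)
```

Here `overlap` close to 1 indicates the mixed state is (approximately) an equilibrium state of the dynamics; sweeping `theta` shows the sensitivity of recall to the threshold value, echoing the abstract's point that recall of memory patterns and of the mixed state requires nearly the same threshold.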