We investigate attractor neural networks with a modular structure, in which a local winner-takes-all rule operates within each module (called a hypercolumn). We carry out a signal-to-noise analysis of storage capacity and noise tolerance and compare the results with simulations. Introducing local winner-takes-all dynamics improves both storage capacity and noise tolerance, while the optimal hypercolumn size depends on network size and noise level.
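The architecture described above can be illustrated with a minimal sketch: a Hebbian attractor network whose units are grouped into hypercolumns, with recall applying a winner-takes-all step inside each hypercolumn so that exactly one unit per module is active. All sizes (`H` hypercolumns, `U` units each, `P` patterns) and the plain outer-product learning rule are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

H, U = 8, 4   # assumed sizes: hypercolumns, units per hypercolumn
N = H * U     # total number of units
P = 5         # number of stored patterns

def make_pattern():
    # A pattern activates exactly one unit per hypercolumn (local WTA code).
    x = np.zeros(N)
    for h in range(H):
        x[h * U + rng.integers(U)] = 1.0
    return x

patterns = np.array([make_pattern() for _ in range(P)])

# Simple Hebbian outer-product learning; connections within a
# hypercolumn are zeroed, since the local WTA handles competition there.
W = patterns.T @ patterns
for h in range(H):
    sl = slice(h * U, (h + 1) * U)
    W[sl, sl] = 0.0

def recall(x, steps=5):
    # Synchronous updates: compute input currents, then let the unit
    # with the largest current win within each hypercolumn.
    for _ in range(steps):
        s = W @ x
        y = np.zeros(N)
        for h in range(H):
            sl = slice(h * U, (h + 1) * U)
            y[h * U + np.argmax(s[sl])] = 1.0  # winner-takes-all per module
        x = y
    return x

# Noisy cue: replace the winner in two hypercolumns with a random unit.
cue = patterns[0].copy()
for h in (0, 1):
    sl = slice(h * U, (h + 1) * U)
    cue[sl] = 0.0
    cue[h * U + rng.integers(U)] = 1.0

out = recall(cue)
print("overlap with stored pattern:", int(out @ patterns[0]), "of", H)
```

The local WTA step guarantees that every state visited during recall is a valid sparse code (one active unit per hypercolumn), which is one intuition for why it helps noise tolerance: noise can move activity within a hypercolumn but cannot change the overall activity level.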