In this paper, a two-dimensional modular architecture for the Hopfield neural network and a distance-based training algorithm are presented, which together improve the storage capacity and reduce the structural complexity of the Hopfield neural network. Our approach involves dividing an N×M network into (N×M)/(n×m) modules of size n×m, with each module functioning independently as a sub-network in conjunction with the inputs from neighboring modules. The technique follows a divide-and-conquer strategy: a complex computational task is solved by dividing it into simpler subtasks and then combining their individual solutions. This task decomposition yields a modular structure with several advantages, viz. better generalization, more intelligible and useful internal representations, and more efficient use of computational hardware. The performance of the proposed technique is evaluated by applying it to various character images. It has been observed that the network exhibits faster convergence characteristics and is capable of successfully reproducing learned patterns from noisy and partial data.
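The modular decomposition described above can be illustrated with a minimal sketch. The code below partitions an N×M grid of ±1 units into (N×M)/(n×m) modules of size n×m, trains each module's weights with the standard Hebbian rule on its own sub-pattern, and recalls by letting each module update independently. This is a simplified illustration only: the inter-module coupling (inputs from neighboring modules) and the paper's distance-based training algorithm are omitted, and the function names and shapes are assumptions, not the authors' implementation.

```python
import numpy as np

def train_modules(patterns, N, M, n, m):
    """Hebbian training of each n×m module of an N×M Hopfield grid.

    patterns: list of ±1 arrays of shape (N, M). Each module stores its own
    (n*m)×(n*m) weight matrix, trained only on its sub-pattern (simplified:
    no inputs from neighboring modules, unlike the paper's architecture).
    """
    weights = {}
    for bi in range(N // n):
        for bj in range(M // m):
            W = np.zeros((n * m, n * m))
            for p in patterns:
                x = p[bi * n:(bi + 1) * n, bj * m:(bj + 1) * m].reshape(-1)
                W += np.outer(x, x)          # Hebbian outer-product rule
            np.fill_diagonal(W, 0)           # no self-connections
            weights[(bi, bj)] = W / len(patterns)
    return weights

def recall(probe, weights, N, M, n, m, iters=20):
    """Synchronous recall: each module settles on its own sub-pattern."""
    state = probe.copy()
    for _ in range(iters):
        for (bi, bj), W in weights.items():
            x = state[bi * n:(bi + 1) * n, bj * m:(bj + 1) * m].reshape(-1)
            x = np.where(W @ x >= 0, 1, -1)  # threshold update
            state[bi * n:(bi + 1) * n, bj * m:(bj + 1) * m] = x.reshape(n, m)
    return state
```

Because each module only stores an (n·m)² weight matrix instead of one (N·M)² matrix for the whole grid, the structural complexity drops sharply, which is the motivation for the modular design.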