Learning and memory properties in fully connected networks
AIP Conference Proceedings 151 on Neural Networks for Computing
Three simple general-purpose networks are tested for pattern classification on an optical character recognition problem: the feed-forward (multi-layer perceptron) network, the Hopfield network, and a competitive learning network. The input patterns are obtained by optically scanning images of printed digits and uppercase letters. For networks with two-state input nodes, the resulting data are used directly as input; for the others, features are extracted by template matching and pixel counting. The classification performance of the networks is compared with a nearest neighbour algorithm applied to the same feature vectors. The feed-forward network reaches the same recognition rates as the nearest neighbour algorithm, even when only a small percentage of the possible connections is used. The Hopfield network performs worse, and overloading of the network remains a problem. When the input patterns are well clustered, the competitive learning network again achieves recognition rates as high as those of the nearest neighbour algorithm.
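The nearest neighbour baseline used for comparison can be sketched as follows: each scanned character is reduced to a feature vector (in the paper, via template matching and pixel counting), and a test pattern receives the label of the closest stored training vector. This is a minimal illustrative sketch, not the paper's implementation; the feature dimensionality, the toy data, and the use of Euclidean distance are all assumptions.

```python
import numpy as np

def nearest_neighbour(train_X, train_y, x):
    """Return the label of the training vector closest to x.

    Uses Euclidean distance (an assumed choice; the paper does not
    specify its metric in the abstract).
    """
    dists = np.linalg.norm(train_X - x, axis=1)
    return train_y[np.argmin(dists)]

# Toy two-dimensional feature vectors for two character classes
# (purely illustrative data, not from the paper).
train_X = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
train_y = np.array(["O", "O", "I", "I"])

print(nearest_neighbour(train_X, train_y, np.array([0.85, 0.15])))  # "O"
```

Because the same feature vectors feed both the baseline and the networks, the comparison isolates the classifier rather than the feature extraction step.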