This paper addresses the design and experimental characterization of a novel hybrid neural network in which two distinct classical architectures interact: the Hopfield neural network and the Multi-Layer Perceptron. This hybrid neural system, named MLP+H (from MLP + Hopfield), performs better than each of the two classical architectures taken separately. In addition, it can handle classes of data beyond those normally allowed by the two conventional architectures. For example, while Hopfield networks deal with binary patterns and MLPs with information that always has some analog character (due to the continuous nature of MLP nodes), the MLP+H accepts analog inputs and produces purely digital outputs. Moreover, the MLP+H allows reduced training times compared with the MLP architecture, while offering compactness and flexibility in dealing with different applications, such as the implementation of anti-noise filters, an application currently under study.
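The abstract does not give implementation details, but the analog-input/digital-output pipeline it describes can be sketched as follows. This is a minimal illustration, not the authors' method: a stand-in linear layer with `tanh` activation plays the role of a trained MLP producing an analog code, which is thresholded and passed to a Hebbian-trained Hopfield network that settles to the nearest stored binary pattern. All names, dimensions, and weights here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stored bipolar (+1/-1) patterns for the Hopfield stage (illustrative).
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
n = patterns.shape[1]

# Hebbian outer-product weight matrix with zero self-connections.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

def hopfield_recall(state, steps=10):
    """Synchronous sign updates until a fixed point (or step limit)."""
    s = state.copy()
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1          # break ties toward +1
        if np.array_equal(new, s):
            break
        s = new
    return s

# Stand-in for a trained MLP: a random linear map + tanh yields an
# analog-valued code (hypothetical; the paper's MLP is trained).
A = rng.normal(size=(n, 4))
def mlp_forward(x):
    return np.tanh(A @ x)

x = rng.normal(size=4)               # analog input
analog_code = mlp_forward(x)         # analog intermediate representation
initial = np.sign(analog_code)       # threshold to a bipolar state
initial[initial == 0] = 1
digital_out = hopfield_recall(initial)  # purely digital (bipolar) output
```

The Hopfield stage acts as an error-correcting "clean-up" memory: even if the MLP's thresholded code is a few bits away from a stored pattern, the recall dynamics pull it onto one of the stored binary attractors, which is one plausible reading of how the hybrid yields purely digital outputs from analog inputs.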