The primary purpose of this study is to reveal the effects of refractoriness on learning performance. We simulated an Elman network composed of chaotic neurons learning a pattern sequence with the back-propagation algorithm. The learning speed was about 46% faster than that of a network composed of integrate-and-fire model neurons. In addition, we analyzed the required number of hidden neurons, the asynchronous activity of hidden neurons induced by refractoriness, and the correlation coefficients of the synaptic weights after learning. These results suggest that refractoriness contributes to efficient encoding in the hidden layer of the Elman network.
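As a minimal sketch of the refractoriness mechanism the abstract refers to, the following assumes an Aihara-style chaotic neuron, in which a negative feedback term proportional to the neuron's recent output suppresses its next internal state. The parameter values (`k`, `alpha`, `a`, and the sigmoid steepness `eps`) are illustrative assumptions, not the values used in the study.

```python
import math

def sigmoid(y, eps=0.05):
    # Steep sigmoid output function; eps (steepness) is an assumed value.
    return 1.0 / (1.0 + math.exp(-y / eps))

def chaotic_neuron(steps, k=0.7, alpha=1.0, a=0.35):
    # Aihara-style chaotic neuron in recursive form:
    #   y(t+1) = k*y(t) - alpha*x(t) + a,  x(t+1) = f(y(t+1))
    # The -alpha*x(t) term models refractoriness: a large recent
    # output drives the internal state down, inhibiting the next firing.
    # All parameter values here are illustrative assumptions.
    y = 0.0
    x = sigmoid(y)
    outputs = []
    for _ in range(steps):
        y = k * y - alpha * x + a   # decayed state minus refractory feedback
        x = sigmoid(y)
        outputs.append(x)
    return outputs

trace = chaotic_neuron(100)
```

With these assumed parameters the output does not settle to a constant; the refractory feedback keeps the neuron's activity varying over time, which is the property the abstract links to asynchronous hidden-layer activity.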