Delayed Learning and the Organized States
Proceedings of the 11th International Conference on Knowledge-Based Intelligent Information and Engineering Systems (KES '07) and the XVII Italian Workshop on Neural Networks
Elman proposed a network with a context layer for time-series processing. The context layer stores the output of the hidden layer and feeds it back to the hidden layer at the next time step. In this paper, the context layer is reformulated as an internal memory layer, which receives weighted connections from the hidden layer; these connection weights form the internal memory. This internal memory plays an important role in learning the time series. We developed a new learning algorithm for the internal memory, called time-delayed back-propagation learning. The ability of the network with the internal memory layer is demonstrated on a simple sinusoidal time series.
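The architecture described above can be sketched as follows. This is a minimal forward-pass illustration only: in a classic Elman network the context layer is a fixed one-to-one copy of the hidden layer, whereas here the memory layer receives weighted (trainable) connections from the hidden layer. The layer sizes, the tanh nonlinearity, and the update order are illustrative assumptions; the abstract does not specify them, nor the details of the time-delayed back-propagation rule used to train `W_mem`.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 1, 8, 1

W_in  = rng.normal(0, 0.5, (n_hid, n_in))   # input  -> hidden
W_mem = rng.normal(0, 0.5, (n_hid, n_hid))  # hidden -> internal memory (trainable, the paper's change)
W_ctx = rng.normal(0, 0.5, (n_hid, n_hid))  # memory -> hidden feedback
W_out = rng.normal(0, 0.5, (n_out, n_hid))  # hidden -> output

def step(x, memory):
    """One time step: the hidden state combines the current input with the
    internal memory, then the memory is rebuilt from the new hidden state
    through W_mem (a weighted copy instead of Elman's identity copy)."""
    h = np.tanh(W_in @ x + W_ctx @ memory)
    y = W_out @ h
    new_memory = np.tanh(W_mem @ h)
    return y, new_memory

# Run the untrained network over one period of a sinusoid, the kind of
# simple time series used in the paper's demonstration.
t = np.linspace(0.0, 2 * np.pi, 50)
series = np.sin(t)
memory = np.zeros(n_hid)
outputs = []
for x in series:
    y, memory = step(np.array([x]), memory)
    outputs.append(float(y[0]))
```

Training would then adjust `W_mem` (together with the other weights) by back-propagating errors through the delayed memory path, which is what the proposed time-delayed back-propagation learning addresses.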