Online Text Prediction with Recurrent Neural Networks
Neural Processing Letters
This paper studies the use of discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than the classical offline grammatical inference with neural networks. The results show that recurrent networks working online perform acceptably when the sequences come from finite-state machines, or even from some chaotic sources. When predicting texts in human language, however, the dynamics appear to be too complex for the network to learn correctly in real time. Two training algorithms are considered: real-time recurrent learning (RTRL) and the decoupled extended Kalman filter (DEKF).
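The abstract names real-time recurrent learning as one of the two training algorithms. As a rough illustration only, not the paper's implementation, the sketch below trains a small tanh recurrent network online with RTRL to predict the next symbol of a toy periodic three-symbol sequence; the architecture, layer sizes, learning rate, and the sequence itself are all assumptions made for this example.

```python
import numpy as np

# Hedged sketch of online next-symbol prediction trained with RTRL.
# NOT the paper's setup: toy 3-symbol periodic sequence, small tanh
# RNN with softmax output, assumed sizes and learning rate.
rng = np.random.default_rng(0)
symbols = "abc"
K, H, lr = len(symbols), 8, 0.3

W = rng.normal(0, 0.1, (H, H))   # recurrent weights
U = rng.normal(0, 0.3, (H, K))   # input weights
V = rng.normal(0, 0.1, (K, H))   # output weights

h = np.zeros(H)
# RTRL sensitivities: P_W[i, j, k] = dh_i/dW_jk, P_U[i, j, k] = dh_i/dU_jk
P_W = np.zeros((H, H, H))
P_U = np.zeros((H, H, K))

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

seq = [symbols.index(c) for c in "abc" * 700]
losses, correct = [], 0
for t in range(len(seq) - 1):
    x, target = one_hot(seq[t], K), seq[t + 1]

    # Forward: h_t = tanh(W h_{t-1} + U x_t), p = softmax(V h_t)
    h_new = np.tanh(W @ h + U @ x)
    logits = V @ h_new
    p = np.exp(logits - logits.max())
    p /= p.sum()
    losses.append(-np.log(p[target]))
    if t >= len(seq) - 101:              # accuracy over the last 100 steps
        correct += int(p.argmax() == target)

    # RTRL recurrence for the sensitivities:
    # dh_i/dW_jk = tanh'(z_i) * (delta_ij * h_{t-1,k} + sum_l W_il * P_W[l,j,k])
    d = 1.0 - h_new ** 2                 # tanh'(z)
    P_W_new = np.einsum('il,ljk->ijk', W, P_W)
    P_W_new[np.arange(H), np.arange(H), :] += h
    P_W_new *= d[:, None, None]
    P_U_new = np.einsum('il,ljk->ijk', W, P_U)
    P_U_new[np.arange(H), np.arange(H), :] += x
    P_U_new *= d[:, None, None]

    # Online gradient step (cross-entropy loss), one update per symbol.
    e = p - one_hot(target, K)           # dL/dlogits
    dh = V.T @ e                         # dL/dh_t
    V -= lr * np.outer(e, h_new)
    W -= lr * np.einsum('i,ijk->jk', dh, P_W_new)
    U -= lr * np.einsum('i,ijk->jk', dh, P_U_new)
    h, P_W, P_U = h_new, P_W_new, P_U_new

print(f"accuracy over last 100 steps: {correct}/100")
```

The sensitivity tensors make each weight update exact for the running gradient, which is what lets the network learn as the sequence streams by; the cost, O(H^4) per step for the recurrent weights, is the classical drawback of RTRL relative to the DEKF variant the paper also considers.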