Exact solutions for recursive principal components analysis of sequences and trees
ICANN'06: Proceedings of the 16th International Conference on Artificial Neural Networks, Part I
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA, and the representations it learns are adapted to the temporal statistics of the input. Moreover, sequences stored in the network can be retrieved explicitly, in the reverse order of presentation, providing a straightforward neural implementation of a logical stack.
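The following is a minimal numpy sketch of the mechanism the abstract describes: a recurrent linear network whose state y(t) compresses the joint vector [x(t); y(t-1)], trained with Sanger's generalized Hebbian algorithm (a multi-component form of Oja's rule), and decoded through the transpose of the weight matrix to recover the previous input and context, which yields the stack-like reverse retrieval. All dimensions, the learning rate, the random training data, and the helper names (step, oja_update, pop) are illustrative assumptions, not taken from the paper, which may use a different variant of the learning rule.

```python
import numpy as np

rng = np.random.default_rng(0)

input_dim, state_dim = 4, 8          # x(t) in R^4, context y(t) in R^8 (assumed sizes)
joint_dim = input_dim + state_dim    # the network compresses [x(t); y(t-1)]
W = rng.normal(scale=0.1, size=(state_dim, joint_dim))
lr = 1e-3

def step(x, y_prev):
    """One forward step: encode the current input together with the
    previous context into the new context y(t)."""
    z = np.concatenate([x, y_prev])  # joint vector [x(t); y(t-1)]
    y = W @ z
    return z, y

def oja_update(W, z, y, lr):
    """Sanger's rule: dW = lr * (y z^T - LT[y y^T] W), where LT keeps
    the lower-triangular part so that component i is decorrelated only
    from components 1..i, giving ordered principal components."""
    return W + lr * (np.outer(y, z) - np.tril(np.outer(y, y)) @ W)

# train online on random sequences; W converges toward orthonormal rows
for _ in range(20000):
    y = np.zeros(state_dim)
    for t in range(10):
        x = rng.normal(size=input_dim)
        z, y = step(x, y)
        W = oja_update(W, z, y, lr)

def pop(y):
    """Because the learned rows of W are (approximately) orthonormal,
    W.T @ y(t) reconstructs [x(t); y(t-1)]: one stack pop."""
    z_hat = W.T @ y
    return z_hat[:input_dim], z_hat[input_dim:]

# demo: push a 3-step sequence, then pop it back in reverse order
seq = [rng.normal(size=input_dim) for _ in range(3)]
y = np.zeros(state_dim)
for x in seq:
    _, y = step(x, y)
for x in reversed(seq):
    x_hat, y = pop(y)   # x_hat approximates x, most recent input first
```

Since joint_dim exceeds state_dim, the encoding is lossy PCA compression: reconstruction quality in the demo degrades with stack depth, which matches the abstract's point that the learned representations are adapted to the temporal statistics of the input rather than storing sequences exactly.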