Traditional Recurrent Neural Networks (RNNs) perform poorly on learning tasks involving long time-lag dependencies. More recent approaches such as LSTM and its variants significantly improve on the RNN's ability to learn this type of problem. We present an alternative approach to encoding temporal dependencies that associates temporal features with nodes rather than with state values: the nodes explicitly encode dependencies over variable time delays. We report promising results comparing the network's performance to that of LSTM variants on an extended Reber grammar task.
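To make the node-level encoding concrete, the sketch below shows one plausible reading of the idea (our own illustration, not the paper's implementation): a layer in which each hidden node carries an explicit per-node time delay and reads its input from that many steps in the past, so a long-lag dependency becomes a structural property of the node rather than something the recurrent state must preserve across many updates. The class name DelayNodeLayer, the random delay-assignment scheme, and all parameter choices are hypothetical.

```python
import numpy as np

class DelayNodeLayer:
    """Hypothetical sketch: hidden nodes with explicit per-node time delays."""

    def __init__(self, n_in, n_hidden, max_delay, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.normal(0.0, 0.1, size=(n_hidden, n_in))
        self.b = np.zeros(n_hidden)
        # One explicit delay per node, drawn from 0..max_delay.
        # (Assumed scheme; the paper may assign or learn delays differently.)
        self.delays = rng.integers(0, max_delay + 1, size=n_hidden)

    def forward(self, xs):
        """xs: array of shape (T, n_in); returns activations of shape (T, n_hidden)."""
        T, n_in = xs.shape
        H = np.zeros((T, self.W.shape[0]))
        for t in range(T):
            for i, d in enumerate(self.delays):
                # Node i looks back exactly d steps; before the sequence
                # begins, a zero input is substituted.
                x = xs[t - d] if t - d >= 0 else np.zeros(n_in)
                H[t, i] = np.tanh(self.W[i] @ x + self.b[i])
        return H

# Usage: a toy sequence; a node with delay d responds to the input from t - d.
layer = DelayNodeLayer(n_in=4, n_hidden=8, max_delay=10)
xs = np.random.default_rng(1).normal(size=(20, 4))
print(layer.forward(xs).shape)  # (20, 8)
```

Under this reading, a dependency spanning d steps costs nothing to learn beyond a node whose delay matches d, whereas a conventional RNN must propagate that information through d consecutive state transitions.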