Architectural bias in recurrent neural networks: fractal analysis
Neural Computation
From controlled dynamical systems to context-dependent grammars: A connectionist approach
Engineering Applications of Artificial Intelligence
ICANN'07: Proceedings of the 17th International Conference on Artificial Neural Networks
Training recurrent connectionist models on symbolic time series
ICONIP'08: Proceedings of the 15th International Conference on Advances in Neuro-Information Processing - Volume Part I
The long short-term memory (LSTM) is not the only neural network that learns a context-sensitive language. Second-order sequential cascaded networks (SCNs) can, from a finite fragment of a context-sensitive language, induce the means to process strings outside the training set. The dynamical behavior of the SCN is qualitatively distinct from that observed in LSTM networks. Differences in performance and dynamics are discussed.
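The defining feature of a second-order network such as the SCN is that the next state is driven by products of input and state components, so the effective input-to-state weights change with context. Below is a minimal sketch of one step of such a second-order recurrent cell; the tensor-weight formulation, the one-hot symbol encoding, and all dimensions and names are illustrative assumptions, not the paper's exact setup (the full SCN additionally generates these weights through a separate context network).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def second_order_step(W, b, x, s):
    """One step of a second-order recurrent cell (illustrative sketch).

    W has shape (state_dim, input_dim, state_dim): each next-state unit i
    is driven by products x_j * s_k, the hallmark of a second-order network.
    """
    return sigmoid(np.einsum('ijk,j,k->i', W, x, s) + b)

rng = np.random.default_rng(0)
state_dim, input_dim = 4, 3
W = rng.normal(scale=0.5, size=(state_dim, input_dim, state_dim))
b = np.zeros(state_dim)

# Run the cell over a one-hot encoded string such as "aabbcc",
# a member of the context-sensitive language a^n b^n c^n.
symbols = {'a': 0, 'b': 1, 'c': 2}
s = np.full(state_dim, 0.5)  # arbitrary initial state
for ch in "aabbcc":
    x = np.eye(input_dim)[symbols[ch]]
    s = second_order_step(W, b, x, s)
```

Because the weights seen by the input are effectively modulated by the current state, a trained network of this kind can carry the counting information needed to accept strings of a^n b^n c^n beyond the lengths seen in training.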