Considerable attention is currently focused on connectionist models known as "reservoir computing". The most prominent example of these approaches is a recurrent neural network architecture called the echo state network (ESN). ESNs have been successfully applied to many real-valued time series modeling tasks, where they performed exceptionally well. Using ESNs for processing symbolic sequences therefore also appears attractive. In this work we experimentally support the claim that, when processing symbolic sequences, the state space of an ESN is organized according to the Markovian architectural bias principles. We compare the performance of ESNs with connectionist models that explicitly use the Markovian architectural bias property, with variable-length Markov models, and with recurrent neural networks trained by advanced training algorithms. Moreover, we show that the number of reservoir units plays a role similar to that of the number of contexts in variable-length Markov models.
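
As a rough illustration of the Markovian state space organization described above, the following Python/NumPy sketch drives a random, untrained reservoir with symbolic sequences and compares the final states. All sizes, weight ranges, and the contraction factor are assumptions made for illustration, not values from the paper; the point is that with a contractive reservoir, sequences sharing a long suffix end up in nearby states regardless of their earlier symbols.

import numpy as np

# A minimal, untrained echo state network driven by a symbolic sequence.
# Symbols are one-hot encoded; reservoir size, weight ranges, and the
# contraction factor 0.8 are illustrative choices, not values from the paper.
rng = np.random.default_rng(0)
n_symbols, n_units = 4, 100

W_in = rng.uniform(-0.5, 0.5, size=(n_units, n_symbols))  # input weights
W = rng.uniform(-0.5, 0.5, size=(n_units, n_units))       # reservoir weights
W *= 0.8 / np.linalg.norm(W, 2)  # spectral norm < 1 => contractive updates

def run(sequence):
    """Drive the reservoir with a symbol sequence; return the final state."""
    x = np.zeros(n_units)
    for s in sequence:
        x = np.tanh(W @ x + W_in @ np.eye(n_symbols)[s])
    return x

suffix = [3, 3, 1, 2, 0, 0, 2, 1]
a = run([0, 1, 2, 3] + suffix)
b = run([3, 2, 0, 1] + suffix)                  # same recent history as `a`
c = run([0, 1, 2, 3, 1, 0, 2, 3, 3, 1, 0, 2])   # same prefix, different suffix

# Markovian organization: `a` and `b` share the last eight input symbols,
# so their states should lie much closer together than `a` and `c`.
print(np.linalg.norm(a - b), np.linalg.norm(a - c))

This suffix-based clustering is what a trained linear readout can exploit, and it suggests why, as the abstract notes, the number of reservoir units plays a role similar to the number of contexts in a variable-length Markov model: more units allow finer-grained clusters, i.e. longer distinguishable suffixes.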