Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Although the recurrent weights are left untrained, these networks can perform universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success is that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One can thus regard the reservoir as a spatiotemporal kernel in which the mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be interpreted as recursive kernels and subsequently used to construct recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
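To make the setting concrete, the following is a minimal sketch of the explicit (finite) case the abstract describes: a random, untrained reservoir whose states serve as nonlinear spatiotemporal features, with only a linear readout trained by ridge regression. All sizes, scalings, and the toy one-step-ahead sine prediction task are illustrative choices, not taken from the letter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative reservoir dimensions and scalings (arbitrary choices).
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)  # untrained recurrent dynamics
        states.append(x.copy())
    return np.asarray(states)            # shape (T, n_res)

# Toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(np.linspace(0.0, 20.0 * np.pi, T)).reshape(-1, 1)
X = run_reservoir(u[:-1])                # spatiotemporal feature expansion
y = u[1:]

# Only the linear readout is trained, here via ridge regression.
lam = 1e-2
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

The reservoir states play the role of an explicitly computed feature map; the infinite-network view in the letter replaces this explicit expansion with a kernel evaluated between input sequences.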
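The "recursive kernel" idea can also be sketched directly: the similarity between two input sequences at time t is defined in terms of their inputs at time t and their similarity at time t-1, so the kernel itself carries the temporal recursion. The particular form below (an RBF term on the current inputs blended with the previous kernel value) is a hypothetical example for illustration only, not one of the kernels derived in the letter.

```python
import numpy as np

def recursive_kernel(u, v, gamma=1.0, lam=0.5):
    """Hypothetical recursive kernel between sequences u, v of shape (T, d).

    k_0 = 1
    k_t = exp(-gamma * ||u_t - v_t||^2) * ((1 - lam) + lam * k_{t-1})

    The recursion mixes instantaneous input similarity with the accumulated
    similarity of the sequence histories; lam controls the memory.
    """
    k = 1.0
    for u_t, v_t in zip(u, v):
        k = np.exp(-gamma * np.sum((u_t - v_t) ** 2)) * ((1.0 - lam) + lam * k)
    return k

# A Gram matrix over such a kernel can be handed to any kernel method
# (e.g. an SVM with a precomputed kernel) to obtain a recursive kernel machine.
seqs = [np.array([[0.0], [1.0], [0.0]]),
        np.array([[0.1], [0.9], [0.2]]),
        np.array([[1.0], [0.0], [1.0]])]
G = np.array([[recursive_kernel(a, b) for b in seqs] for a in seqs])
```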