This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.
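The sequential weight-and-covariance update the abstract describes can be sketched in a minimal, hedged form. The sketch below is not the paper's algorithm: the RNN Jacobian is replaced by that of a toy linear model `f(w, x) = w @ x`, and the hyperparameters `alpha` (weight-prior precision) and `beta` (noise precision) — which the paper adapts online — are held fixed at assumed values. It only illustrates the recursive Kalman-style second-order update structure (gain, weight update, covariance update).

```python
import numpy as np

# Hedged sketch of a recursive second-order (Kalman-style) update.
# A linear model stands in for the RNN; alpha/beta are assumed constants,
# whereas the paper adapts these hyperparameters during training.
rng = np.random.default_rng(0)
n = 3
w_true = np.array([0.5, -1.0, 2.0])   # synthetic target weights

alpha, beta = 1e-2, 25.0              # prior and noise precisions (assumed)
w = np.zeros(n)                       # weight estimate
P = np.eye(n) / alpha                 # covariance, initialised from the weight prior

for _ in range(200):
    x = rng.normal(size=n)
    y = w_true @ x + rng.normal(scale=1.0 / np.sqrt(beta))
    H = x                             # Jacobian df/dw of the linear stand-in model
    e = y - w @ x                     # innovation (prediction error)
    S = H @ P @ H + 1.0 / beta        # innovation variance
    K = P @ H / S                     # gain
    w = w + K * e                     # sequential weight update
    P = P - np.outer(K, H @ P)        # sequential covariance update

print(np.round(w, 2))
```

With enough samples the estimate converges toward `w_true`; in the full algorithm the same recursion runs over the RNN's Jacobian with Levenberg-Marquardt damping and online adaptation of `alpha` and `beta`.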