Ensemble methods for classification and regression have attracted a great deal of attention in recent years. It has been shown, both theoretically and empirically, that they can perform substantially better than single models on a wide range of tasks. We have adapted an ensemble method to the problem of predicting future values of time series, using recurrent neural networks (RNNs) as base learners. The improvement comes from combining a large number of RNNs, each of which is generated by training on a different set of examples. The algorithm is based on boosting, in which the learning process concentrates on the difficult points of the time series; unlike the original algorithm, however, we introduce a new parameter that tunes the influence of boosting on the available examples. We test our boosting algorithm for RNNs on single-step-ahead and multi-step-ahead prediction problems. The results are then compared to those of other regression methods, including different local approaches. On these datasets, the overall results obtained with our ensemble method are more accurate than those obtained with the standard training method, backpropagation through time, and they remain significantly better even when long-range dependencies play an important role.
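The abstract describes the method only at a high level, so the sketch below is an illustration rather than the authors' algorithm. It uses an AdaBoost.R2-style reweighting loop for regression and adds a hypothetical uniform-blending term k/n to play the role of the "new parameter for tuning the boosting influence"; a weighted linear autoregressive model stands in for the recurrent networks trained with backpropagation through time, and the names boost, make_windows, LinearAR, and k are ours, not from the paper.

```python
# Sketch of boosting for time-series regression with a tunable
# "boosting influence" parameter k. Assumptions (not from the paper):
# the weight update p_i <- p_i * beta**(1 - L_i) + k/n, and a linear
# autoregressive base learner in place of an RNN trained with BPTT.
import numpy as np

rng = np.random.default_rng(0)

def make_windows(series, order):
    """Turn a 1-D series into (input window, next value) pairs."""
    X = np.array([series[t:t + order] for t in range(len(series) - order)])
    return X, series[order:]

class LinearAR:
    """Stand-in base learner: weighted least-squares autoregression."""
    def fit(self, X, y, sample_weight):
        A = np.c_[X, np.ones(len(X))]            # add a bias column
        sw = np.sqrt(sample_weight)
        self.coef_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
        return self
    def predict(self, X):
        return np.c_[X, np.ones(len(X))] @ self.coef_

def boost(series, order=4, n_rounds=10, k=0.5):
    """Boosted ensemble; k >= 0 controls how strongly the weighting
    concentrates on hard examples (k = 0: pure loss-driven update,
    large k: nearly uniform weights)."""
    X, y = make_windows(series, order)
    n = len(y)
    p = np.full(n, 1.0 / n)                      # example weights
    learners, betas = [], []
    for _ in range(n_rounds):
        h = LinearAR().fit(X, y, sample_weight=p * n)
        err = np.abs(h.predict(X) - y)
        L = err / (err.max() + 1e-12)            # normalized loss in [0, 1]
        eps = np.sum(p * L)                      # weighted average loss
        if eps >= 0.5:                           # AdaBoost.R2-style stop
            break
        beta = eps / (1.0 - eps)
        # Hypothetical update: loss-driven reweighting blended with a
        # uniform term controlled by k, then renormalized.
        p = p * beta ** (1.0 - L) + k / n
        p /= p.sum()
        learners.append(h)
        betas.append(beta)
    weights = np.log(1.0 / np.array(betas))
    def predict(Xq):
        preds = np.array([h.predict(Xq) for h in learners])
        return np.average(preds, axis=0, weights=weights)
    return predict

# Toy usage: single-step-ahead prediction on a noisy sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.1 * rng.standard_normal(t.size)
predict = boost(series[:1500], order=8, n_rounds=10, k=0.5)
Xte, yte = make_windows(series[1500:], 8)
print("test MAE:", np.mean(np.abs(predict(Xte) - yte)))
```

With k = 0 the update reduces to plain AdaBoost.R2 reweighting; larger values of k keep the distribution closer to uniform, limiting how aggressively the ensemble concentrates on the difficult points of the series. The weighted mean used to combine learners is a common simplification of AdaBoost.R2's weighted median.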