This paper considers a short-term load forecasting method based on a flexible smooth transition autoregressive (STAR) model. The model is linear with time-varying coefficients, which are the outputs of a single-hidden-layer feedforward neural network. The hidden layer partitions the input space into multiple sub-spaces through multivariate thresholds and provides smooth transitions between those sub-spaces. We propose a new method to initialize the weights of the hidden layer before training: a self-organizing map (SOM) is applied to split the historical data dynamics into clusters, and the Ho-Kashyap algorithm is then used to obtain the equations of the separating planes. Applied to electricity markets, the proposed method better models the smooth transitions between the different regimes present in the load demand series due to market and seasonal effects. We use data from three electricity markets to compare the prediction accuracy of the proposed method with traditional benchmarks and other recent models, and find our results to be competitive.
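The Ho-Kashyap step mentioned above can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the load data, the SOM clustering, and the STAR model are omitted, and the function name and parameters are illustrative. It only shows the classic Ho-Kashyap iteration for finding a separating plane between two linearly separable point sets, which is the role it plays here (turning SOM clusters into initial hidden-layer hyperplanes).

```python
import numpy as np

def ho_kashyap(X1, X2, rho=0.5, n_iter=500, tol=1e-8):
    """Classic Ho-Kashyap procedure (illustrative sketch).

    Finds a weight vector a = (w, w0) such that w.x + w0 > 0 for
    samples in X1 and w.x + w0 < 0 for samples in X2, assuming the
    two sets are linearly separable.
    """
    # Augment each sample with a bias term; negate the class-2 rows
    # so separation becomes the single condition Y a > 0.
    Y1 = np.hstack([X1, np.ones((len(X1), 1))])
    Y2 = -np.hstack([X2, np.ones((len(X2), 1))])
    Y = np.vstack([Y1, Y2])

    Yp = np.linalg.pinv(Y)      # pseudo-inverse, reused every step
    b = np.ones(len(Y))         # margin vector, kept strictly positive
    a = Yp @ b                  # least-squares solution for current b

    for _ in range(n_iter):
        e = Y @ a - b                    # error against the margins
        b = b + rho * (e + np.abs(e))    # grow b only where e > 0
        a = Yp @ b
        if np.all(np.abs(e) < tol):      # margins met: done
            break
    return a
```

Usage on two separable clusters: `a = ho_kashyap(X1, X2)` yields a decision function `g(x) = a[:2] @ x + a[2]` that is positive on `X1` and negative on `X2`; in the paper's scheme such plane equations would seed the hidden-layer weights before training.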