Extreme learning machine (ELM) is an effective training algorithm for single-hidden-layer feedforward neural networks (SLFNs), but it often requires a large number of hidden units, which makes the trained network respond slowly to input patterns. The regularized least-squares extreme learning machine (RLS-ELM) is one of the improvements that can overcome this problem: it determines the input weights, including the hidden-layer biases, by a regularized least-squares scheme, and the output weights by the pseudo-inverse of the hidden-layer output matrix. In this paper, we extend RLS-ELM to online sequential learning in order to deal with large training datasets. The resulting algorithm can learn arriving data one-by-one or chunk-by-chunk, with chunks of fixed or varying size. Experimental results show that the proposed approach achieves good performance with a compact network, which yields high speed in both training and testing.
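To make the two ingredients mentioned in the abstract concrete, the sketch below shows (a) output weights computed from the pseudo-inverse of the hidden-layer output matrix, and (b) an online sequential phase that absorbs arriving chunks with a recursive least-squares update, as in OS-ELM-style schemes. This is an illustrative toy, not the paper's RLS-ELM: in particular, the input weights and biases here are simply drawn at random (the basic ELM scheme), whereas RLS-ELM fits them by regularized least squares; all sizes, the sigmoid activation, and the regularization constant are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: learn y = sin(x) with a single-hidden-layer network.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X)

# Random input weights and biases (basic ELM; RLS-ELM would instead fit
# these by regularized least squares -- not reproduced here).
n_hidden = 25
W = rng.normal(size=(1, n_hidden))
b = rng.normal(size=(1, n_hidden))

def hidden(X):
    """Hidden-layer output matrix H (sigmoid activations)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# --- Batch initialization: output weights via the pseudo-inverse of H. ---
n_init = 50
H0 = hidden(X[:n_init])
beta = np.linalg.pinv(H0) @ y[:n_init]
# Inverse covariance kept for the sequential updates (small ridge term
# added for numerical stability -- an assumption of this sketch).
P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(n_hidden))

# --- Online phase: recursive least-squares update, chunk by chunk. ---
chunk = 10
for start in range(n_init, len(X), chunk):
    Hk = hidden(X[start:start + chunk])
    yk = y[start:start + chunk]
    # Woodbury-style update of P, then correct beta toward the new chunk.
    K = P @ Hk.T @ np.linalg.inv(np.eye(len(Hk)) + Hk @ P @ Hk.T)
    P = P - K @ Hk @ P
    beta = beta + P @ Hk.T @ (yk - Hk @ beta)

mse = float(np.mean((hidden(X) @ beta - y) ** 2))
print(mse)
```

Because each chunk only updates the small `n_hidden`-by-`n_hidden` matrix `P`, the network never revisits past data, which is what makes sequential schemes of this kind attractive for large training sets.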