A learning scheme based on the Extreme Learning Machine (ELM) and L1/2 regularization is proposed for double parallel feedforward neural networks (DPFNNs). ELM is widely used as a fast learning method for feedforward networks with a single hidden layer. A key problem for ELM is choosing the (minimum) number of hidden nodes. To address this problem, we propose to combine ELM with the L1/2 regularization method, which has become popular in informatics in recent years. Our experiments show that applying the L1/2 regularizer to a DPFNN trained with ELM yields fewer hidden nodes with equally good performance.
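The combination described above can be sketched as follows: an ordinary ELM step (random, fixed hidden-layer weights followed by a least-squares solve for the output weights) plus a proximal-gradient loop that applies the L1/2 half-thresholding operator to the output weights, driving weights of superfluous hidden nodes to zero. This is a minimal illustrative sketch, not the paper's exact algorithm: the function names, the single-hidden-layer (rather than double parallel) architecture, and all hyperparameter values (`n_hidden`, `lam`, `n_iter`) are assumptions for the example.

```python
import numpy as np

def half_threshold(b, lam, mu):
    # Half-thresholding operator for the L1/2 penalty (illustrative form):
    # entries below the threshold are set exactly to zero, promoting sparsity.
    t = (54 ** (1 / 3) / 4) * (lam * mu) ** (2 / 3)
    out = np.zeros_like(b)
    mask = np.abs(b) > t
    phi = np.arccos((lam * mu / 8) * (np.abs(b[mask]) / 3) ** (-1.5))
    out[mask] = (2 / 3) * b[mask] * (1 + np.cos(2 * np.pi / 3 - 2 * phi / 3))
    return out

def elm_l12(X, T, n_hidden=20, lam=1e-3, n_iter=200, seed=0):
    # Sketch of ELM with an L1/2-regularized output layer (assumed setup).
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights, kept fixed
    b = rng.normal(size=n_hidden)                # random hidden biases, kept fixed
    H = np.tanh(X @ W + b)                       # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                 # standard ELM least-squares start
    mu = 1.0 / np.linalg.norm(H, 2) ** 2         # step size from the spectral norm
    for _ in range(n_iter):
        grad = H.T @ (H @ beta - T)              # gradient of the squared error
        beta = half_threshold(beta - mu * grad, lam, mu)
    return W, b, beta
```

Hidden nodes whose output weights are thresholded to zero contribute nothing to the network output, so counting the nonzero entries of `beta` gives the effective number of hidden nodes retained by the regularizer.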