Basic extreme learning machines use a least-squares solution to compute the network's output weights. In the presence of outliers and multi-collinearity, however, the least-squares solution breaks down. To address this problem, a new kind of extreme learning machine is proposed. An outlier-detection technique is introduced to locate outliers and suppress their influence, and the least-squares solution is replaced by a regularized solution for computing the output weights, during which the number of hidden nodes is also chosen automatically. Simulation results show that the proposed model achieves good prediction performance on both clean datasets and datasets contaminated by outliers.
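The regularized output-weight computation can be sketched as follows. This is a minimal illustration assuming a standard single-hidden-layer ELM with ridge (L2) regularization; all function names and parameters are illustrative, and the paper's outlier-detection and automatic node-selection steps are not reproduced here.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, alpha=1e-2, seed=None):
    """Fit a basic ELM: random hidden layer, regularized output weights.

    alpha > 0 adds ridge regularization, which keeps the linear solve
    well-posed when the hidden-layer outputs are multi-collinear;
    alpha = 0 would recover the plain least-squares solution.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Input weights and biases are drawn randomly and never trained.
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Regularized least squares: beta = (H^T H + alpha I)^{-1} H^T y
    beta = np.linalg.solve(H.T @ H + alpha * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: fit a simple 1-D regression and measure the training error.
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
W, b, beta = elm_fit(X, y, n_hidden=40, alpha=1e-3, seed=0)
mse = np.mean((elm_predict(X, W, b, beta) - y) ** 2)
```

Because only the linear output layer is solved for, training cost is dominated by one `n_hidden × n_hidden` solve, which is what makes ELM training fast relative to gradient-based networks.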