Unlike previous SVM algorithms, which use a kernel to evaluate the dot products of data points in a feature space, here points are explicitly mapped into a feature space by a single hidden layer feedforward network (SLFN) whose input weights are randomly generated. In theory this formulation, which can be interpreted as a special form of regularization network (RN), tends to provide better generalization performance than the standard learning algorithm for SLFNs, the Extreme Learning Machine (ELM), and it leads to an extremely simple and fast nonlinear SVM algorithm that requires only the inversion of a potentially small matrix whose order is independent of the size of the training dataset. The experimental results show that the proposed Extreme SVM produces better generalization performance than ELM almost all of the time and runs much faster than other nonlinear SVM algorithms of comparable accuracy.
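The core computation described above can be illustrated with a minimal sketch. The specific activation function, regularization form, and solver below are assumptions for illustration (sigmoid hidden units and a ridge-style regularized least-squares solve), not the paper's exact formulation; the point is that the only matrix inverted is of order `n_hidden`, independent of the number of training samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_extreme_svm(X, y, n_hidden=50, C=1.0):
    """Map inputs through a random SLFN, then solve a regularized
    least-squares problem in the explicit feature space.
    Assumed details: sigmoid activation, ridge regularization 1/C."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))   # random input weights (never trained)
    b = rng.standard_normal(n_hidden)        # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # explicit feature map, shape (n, n_hidden)
    # Only an (n_hidden x n_hidden) system is solved, regardless of n.
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)
    return W, b, beta

def predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.sign(H @ beta)

# Toy usage: two well-separated Gaussian blobs with labels -1/+1.
X = np.vstack([rng.normal(-2, 0.5, (40, 2)), rng.normal(2, 0.5, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])
W, b, beta = fit_extreme_svm(X, y)
acc = np.mean(predict(X, W, b, beta) == y)
```

Because the randomly weighted hidden layer replaces the kernel, training cost is dominated by forming and solving the small `n_hidden x n_hidden` system, which is what makes the method fast on large training sets.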