In this paper we describe a training method for a one-hidden-layer multilayer perceptron (MLP) classifier based on the idea of support vector machines (SVMs). An upper bound on the Vapnik-Chervonenkis (VC) dimension is iteratively minimized over the interconnection matrix of the hidden layer and its bias vector. The output weights are determined according to the support vector method, but without making use of the classifier form related to Mercer's condition. The method is illustrated on a two-spiral classification problem.
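The procedure above can be sketched in code. This is a minimal illustrative sketch, not the paper's algorithm: the `two_spirals` generator, the use of R²·||w||² as a proxy for the VC-dimension upper bound, the hinge-loss subgradient fit for the output weights, and all hyperparameters are simplifying assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-spiral toy data (the benchmark problem mentioned in the abstract).
def two_spirals(n=80, noise=0.1):
    t = np.linspace(0.5, 3 * np.pi, n)
    arm = np.c_[t * np.cos(t), t * np.sin(t)]
    X = np.vstack([arm, -arm]) + noise * rng.standard_normal((2 * n, 2))
    y = np.r_[np.ones(n), -np.ones(n)]
    return X, y

# One hidden layer of tanh units: h = tanh(V x + c),
# V is the interconnection matrix, c the bias vector.
def hidden(X, V, c):
    return np.tanh(X @ V.T + c)

# SVM-style output weights: hinge loss + L2 penalty, plain subgradient descent
# (a stand-in for the support vector method used in the paper).
def fit_output(H, y, lam=1e-3, lr=0.1, steps=300):
    w = np.zeros(H.shape[1])
    b = 0.0
    for _ in range(steps):
        margins = y * (H @ w + b)
        active = margins < 1
        gw = 2 * lam * w - (y[active, None] * H[active]).sum(0) / len(y)
        gb = -y[active].sum() / len(y)
        w -= lr * gw
        b -= lr * gb
    return w, b

# Hypothetical VC-bound proxy: R^2 * ||w||^2, where R is the radius of the
# hidden-layer feature cloud (a common margin-based bound surrogate).
def vc_proxy(X, V, c, w):
    H = hidden(X, V, c)
    R2 = np.max(np.sum((H - H.mean(0)) ** 2, axis=1))
    return R2 * (w @ w)

X, y = two_spirals()
nh = 20                                   # number of hidden units (assumption)
V = rng.standard_normal((nh, 2)) * 0.5
c = rng.standard_normal(nh) * 0.5

# Alternate: fit output weights, then take a finite-difference descent step
# on the bound proxy with respect to the hidden-layer parameters (V, c).
for outer in range(5):
    H = hidden(X, V, c)
    w, b = fit_output(H, y)
    eps, step = 1e-4, 1e-2
    base = vc_proxy(X, V, c, w)
    gV = np.zeros_like(V)
    gc = np.zeros_like(c)
    for i in range(nh):
        for j in range(2):
            Vp = V.copy()
            Vp[i, j] += eps
            gV[i, j] = (vc_proxy(X, Vp, c, w) - base) / eps
        cp = c.copy()
        cp[i] += eps
        gc[i] = (vc_proxy(X, V, cp, w) - base) / eps
    V -= step * gV
    c -= step * gc

H = hidden(X, V, c)
w, b = fit_output(H, y)
acc = np.mean(np.sign(H @ w + b) == y)
print(f"train accuracy: {acc:.2f}")
```

The finite-difference gradient stands in for whatever analytic update the paper uses; the point of the sketch is the alternation between SVM-style output-weight fitting and descent on a capacity bound over the hidden layer.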