The nature of statistical learning theory
Sparse Greedy Matrix Approximation for Machine Learning
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Rule Extraction from a Multi-Layer Perceptron with Staircase Activation Functions
IJCNN '00 Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN'00) - Volume 3
Convergence of the IRWLS Procedure to the Support Vector Machine Solution
Neural Computation
Structural simplification of a feed-forward, multilayer perceptron artificial neural network
ICASSP '91 Proceedings of the 1991 International Conference on Acoustics, Speech, and Signal Processing
IEEE Transactions on Neural Networks
Extraction of rules from artificial neural networks for nonlinear regression
IEEE Transactions on Neural Networks
An Iterative Method for Deciding SVM and Single Layer Neural Network Structures
Neural Processing Letters
Fuzzy ARTMAP and hybrid evolutionary programming for pattern classification
Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology - Evolutionary neural networks for practical applications
Due to their excellent performance, support vector machines (SVMs) are now used extensively in pattern classification applications. In this paper, we show that the standard sigmoidal kernel definition lacks the capability to represent the family of perceptrons, and we propose an improved SVM with a sigmoidal kernel, called the support vector perceptron (SVP). Using both synthetic and real-world data sets, we show that the proposed SVP provides very accurate results in many classification problems, yielding maximal-margin solutions when the classes are separable, and producing compact architectures comparable to classical multilayer perceptrons.
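For context, the "standard sigmoidal kernel" the abstract refers to is commonly defined as k(x, y) = tanh(a·⟨x, y⟩ + b), and an SVM's decision function is a kernel expansion over the support vectors. A minimal sketch of that baseline (with hand-picked, hypothetical support vectors and multipliers, not the paper's SVP construction) might look like:

```python
import math

def sigmoid_kernel(x, y, a=1.0, b=-1.0):
    # Standard sigmoidal SVM kernel: k(x, y) = tanh(a * <x, y> + b).
    return math.tanh(a * sum(xi * yi for xi, yi in zip(x, y)) + b)

def svm_decision(x, support_vectors, labels, alphas, bias, a=1.0, b=-1.0):
    # Kernel-expansion decision function: f(x) = sum_i alpha_i y_i k(x_i, x) + bias.
    # The sign of f(x) gives the predicted class.
    return sum(al * yl * sigmoid_kernel(sv, x, a, b)
               for sv, yl, al in zip(support_vectors, labels, alphas)) + bias

# Toy illustration: two hypothetical support vectors on the x-axis.
svs = [(1.0, 0.0), (-1.0, 0.0)]
ys = [+1, -1]
alphas = [1.0, 1.0]

f = lambda x: svm_decision(x, svs, ys, alphas, bias=0.0)
print(f((2.0, 0.0)) > 0)    # positive side of the boundary -> True
print(f((-2.0, 0.0)) > 0)   # negative side of the boundary -> False
```

With a single support vector, each term tanh(a·⟨x_i, x⟩ + b) has exactly the form of one sigmoidal hidden unit, which is why the sigmoidal kernel invites comparison with perceptrons; the paper's point is that the standard definition does not cover the full perceptron family.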