This paper introduces a new classifier design method based on a modification of the classical Ho-Kashyap procedure. The proposed method uses an absolute-and-squared approximation of the misclassification error rate to design a linear classifier. Moreover, the generalization ability can be easily controlled and robustness to outliers is obtained. Like the support vector machine, the proposed classifier design maximizes the separation margin. Nine public-domain benchmark datasets are used to evaluate the performance of the modified Ho-Kashyap classifier, in comparison with the support vector machine, kernel Fisher discriminant, regularized AdaBoost and a radial-basis-function neural network. Large-scale simulations demonstrate that the proposed method is competitive with these state-of-the-art classifiers.
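For orientation, the sketch below shows the classical Ho-Kashyap procedure that the paper modifies, not the authors' absolute-and-squared, margin-maximizing variant. It assumes a two-class problem with labels in {-1, +1}; the function names (ho_kashyap, predict) and the step-size parameter rho are illustrative choices, not taken from the paper.

import numpy as np

def ho_kashyap(X, y, rho=0.5, n_iter=1000, tol=1e-6):
    # Classical Ho-Kashyap training (sketch only; the paper's modified
    # criterion with regularization and outlier robustness is not shown).
    # X: (n_samples, n_features) data matrix, y: labels in {-1, +1}.
    # Augment with a bias term and reflect the negative class so that
    # correct classification of every sample corresponds to Y @ w > 0.
    Y = np.hstack([X, np.ones((X.shape[0], 1))]) * y[:, None]
    Y_pinv = np.linalg.pinv(Y)

    b = np.ones(Y.shape[0])             # target margins, kept strictly positive
    w = Y_pinv @ b                      # least-squares solution of Y w = b
    for _ in range(n_iter):
        e = Y @ w - b                   # gap between achieved and target margins
        e_plus = 0.5 * (e + np.abs(e))  # only positive errors may enlarge b
        b = b + 2.0 * rho * e_plus
        w = Y_pinv @ b                  # re-solve for the new margin vector
        if np.abs(e).max() < tol:       # converged (or the data are separable)
            break
    return w, b

def predict(w, X):
    # Sign of the augmented linear discriminant learned above.
    return np.sign(np.hstack([X, np.ones((X.shape[0], 1))]) @ w)

The key design choice in this family of methods is that the margin vector b is adapted jointly with the weights w: errors can only increase the target margins, which is what the paper's modification exploits to control generalization and to soften the influence of outliers.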