A class of reduced multivariate polynomial (RM) models has recently been proposed that performs well on classification tasks involving few features and abundant training data. The RM method, however, fits its parameters with a ridge least-squares estimator, overlooking the fact that minimum squared error does not generally correspond to minimum classification error. In this paper, we propose an iteratively reweighted regression method, together with two novel weight functions, for fitting the RM model (IRF-RM). At each iteration, the IRF-RM method increases the weights of samples prone to misclassification and decreases the weights of samples far from the decision boundary, making the fitted model better suited to pattern classification. A number of benchmark data sets are used to evaluate the IRF-RM method. Experimental results indicate that IRF-RM achieves higher or comparable classification accuracy relative to RM and several state-of-the-art classification approaches.
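The reweighting scheme described in the abstract can be sketched as an iteratively reweighted least-squares loop around a weighted ridge solve. The paper's two actual weight functions are not reproduced here; the logistic weight on the signed margin below is purely an illustrative assumption, as is the assumption that `X` already contains the RM polynomial feature expansion plus a bias column and that labels are coded as +1/-1.

```python
import numpy as np

def irls_ridge_fit(X, y, lam=1e-2, n_iter=20):
    """Illustrative iteratively reweighted ridge regression for binary
    classification, in the spirit of IRF-RM.

    X : (n, d) design matrix (assumed to hold the polynomial expansion
        and a bias column); y : (n,) labels in {+1, -1}.
    The weight function below is a hypothetical stand-in, not the
    paper's: it up-weights samples that are misclassified or near the
    decision boundary and down-weights easy, far-from-boundary samples.
    """
    n, d = X.shape
    w = np.ones(n)                      # start from uniform weights (plain ridge LS)
    beta = np.zeros(d)
    for _ in range(n_iter):
        # weighted ridge least-squares solve:
        # beta = (X^T W X + lam I)^{-1} X^T W y
        A = X.T @ (w[:, None] * X) + lam * np.eye(d)
        b = X.T @ (w * y)
        beta = np.linalg.solve(A, b)
        # signed margin: negative => misclassified, large positive => easy
        margin = y * (X @ beta)
        # logistic weight on the margin (assumed form): near 1 for
        # misclassified samples, decaying toward 0 for confident ones
        w = 1.0 / (1.0 + np.exp(4.0 * (margin - 1.0)))
    return beta
```

On a toy two-cluster problem the loop behaves as expected: the first iteration reproduces the ridge least-squares fit, and subsequent iterations concentrate the fit on boundary samples.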