The output of a classifier is usually the value of a discriminant function, and decisions are made from this output even though it does not necessarily represent the posterior probability needed for the soft decision of classification. It is therefore desirable to calibrate the classifier's output so that it can be interpreted as the posterior probability of class membership. This paper presents a new postprocessing method for the probabilistic scaling of a classifier's output. The distribution of the output for each class is modeled by a beta distribution, and, for a more accurate approximation of the class output distribution, the beta distribution parameters together with the kernel parameters of the discriminant function are adjusted to improve the uniformity of the beta cumulative distribution function (CDF) values for the given class output samples. As a result, the classifier with the proposed scaling method, referred to as the class probability output network (CPON), provides accurate posterior probabilities for the soft decision of classification. To demonstrate the effectiveness of the proposed method, pattern-classification experiments using support vector machine (SVM) classifiers were performed on University of California, Irvine (UCI) data sets. The results show that SVM classifiers with the proposed CPON achieve a statistically significant performance improvement over SVM and SVM-related classifiers, as well as over other probabilistic scaling methods.
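The core idea of beta-distribution scaling can be sketched as follows. This is a minimal illustration, not the authors' CPON implementation: it fits a beta density to each class's output scores with scipy's maximum-likelihood fit, converts a new score to a posterior via Bayes' rule, and uses a Kolmogorov-Smirnov test on the CDF-transformed samples as the uniformity check (the paper instead adjusts the beta and kernel parameters jointly to improve this uniformity). Function names and the use of scipy are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def fit_beta_calibrator(scores_pos, scores_neg):
    """Fit one beta distribution per class to classifier outputs in (0, 1)."""
    # floc/fscale pin the support to [0, 1] so only shape parameters a, b are fit
    a1, b1, _, _ = stats.beta.fit(scores_pos, floc=0, fscale=1)
    a0, b0, _, _ = stats.beta.fit(scores_neg, floc=0, fscale=1)
    return (a1, b1), (a0, b0)

def posterior(score, pos_params, neg_params, prior_pos=0.5):
    """Posterior P(class = 1 | score) via Bayes' rule on the fitted densities."""
    p1 = stats.beta.pdf(score, *pos_params) * prior_pos
    p0 = stats.beta.pdf(score, *neg_params) * (1.0 - prior_pos)
    return p1 / (p1 + p0)

def uniformity_pvalue(scores, params):
    """K-S p-value for uniformity of beta-CDF values; high p-value = good fit."""
    u = stats.beta.cdf(scores, *params)
    return stats.kstest(u, 'uniform').pvalue

# Illustrative use with synthetic per-class score samples
pos = stats.beta.rvs(5, 2, size=500, random_state=0)  # scores of class-1 samples
neg = stats.beta.rvs(2, 5, size=500, random_state=1)  # scores of class-0 samples
pos_params, neg_params = fit_beta_calibrator(pos, neg)
print(posterior(0.9, pos_params, neg_params))  # close to 1 for a high score
print(uniformity_pvalue(pos, pos_params))      # large p-value: CDF values uniform
```

Using the CDF-uniformity criterion rather than raw likelihood is what lets the method detect when the assumed output distribution is a poor fit; in the paper this criterion also drives the adjustment of the kernel parameters of the discriminant function.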