This paper discusses the design of structures and algorithms for non-MAP multiclass decision problems. We propose a parametric family of loss functions that yields accurate estimates of the posterior class probabilities near the decision regions, and we derive learning algorithms based on the stochastic gradient minimization of these losses. We show that these algorithms behave as sample selectors: samples near the decision regions are the most influential during learning. We also show that these loss functions can serve as an alternative to support vector machine (SVM) classifiers in low-dimensional feature spaces. Experimental results on several real data sets demonstrate the effectiveness of this approach compared with the classical cross entropy, which is based on a global estimate of the posterior probabilities.
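To make the sample-selector behavior concrete, the following is a minimal sketch of stochastic gradient minimization of a parametric loss whose per-sample gradient concentrates on samples near the decision boundary. The loss family |y - p|^alpha on a logistic model is an illustrative stand-in, not the paper's actual parametric family, and every name in the code is hypothetical.

```python
# Illustrative sketch only: stochastic gradient minimization of a
# parametric loss L(y, p) = |y - p|**alpha on a logistic model.
# This loss family is a hypothetical stand-in for the paper's family;
# it is chosen because its gradient concentrates on samples whose
# predicted posterior p is near the decision threshold.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_loss(w, x, y, alpha=2.0):
    """Gradient of |y - p|**alpha with respect to w, where p = sigmoid(w @ x).

    dL/dw = -alpha * |y - p|**(alpha - 1) * sign(y - p) * p * (1 - p) * x.
    The factors |y - p|**(alpha - 1) and p * (1 - p) jointly damp updates
    for samples far from the decision region, so learning is driven
    mainly by samples near the boundary (the "sample selector" effect).
    """
    p = sigmoid(w @ x)
    return -alpha * np.abs(y - p) ** (alpha - 1) * np.sign(y - p) * p * (1 - p) * x

# Toy two-class data with a noisy linear boundary through the origin.
X = rng.normal(size=(200, 2))
y = (X @ np.array([1.0, -1.0]) + 0.3 * rng.normal(size=200) > 0).astype(float)

# Plain stochastic gradient descent over shuffled samples.
w = np.zeros(2)
lr = 0.5
for epoch in range(50):
    for i in rng.permutation(len(X)):
        w -= lr * grad_loss(w, X[i], y[i])

acc = np.mean((sigmoid(X @ w) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

Under these assumptions, confidently classified samples contribute negligible gradient, so the effective training set shrinks toward the samples closest to the decision region, which is the behavior the abstract attributes to the proposed loss family.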