Loss functions to combine learning and decision in multiclass problems

  • Authors:
  • Alicia Guerrero-Curieses, Rocío Alaiz-Rodríguez, Jesús Cid-Sueiro

  • Affiliations:
  • Departamental I (Campus de Fuenlabrada), Universidad Rey Juan Carlos, Camino del Molino s/n, 28943 Fuenlabrada, Madrid, Spain; Department of Electrical and Electronic Engineering, Universidad de León, 24071 León, Spain; Department of Signal Processing and Communications, EPS, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés, Madrid, Spain

  • Venue:
  • Neurocomputing
  • Year:
  • 2005



Abstract

This paper discusses the design of structures and algorithms for non-MAP multiclass decision problems. We propose a parametric family of loss functions that provides accurate estimates of the posterior class probabilities near the decision regions, and we discuss learning algorithms based on the stochastic gradient minimization of these loss functions. We show that these algorithms behave like sample selectors: samples near the decision regions are the most relevant during learning. Furthermore, these loss functions can be seen as an alternative to support vector machine (SVM) classifiers for low-dimensional feature spaces. Experimental results on several real data sets illustrate the effectiveness of this approach compared with the classical cross entropy, which is based on global posterior probability estimation.
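To make the contrast in the abstract concrete, the sketch below trains a linear softmax classifier by stochastic gradient descent with (a) standard cross entropy and (b) a hypothetical boundary-weighted variant that up-weights samples whose two largest posterior estimates are close, mimicking the "sample selector" behaviour described above. This is a minimal illustration, not the authors' proposed loss family: the weighting rule, function names, and toy data are all assumptions introduced here.

```python
# Illustrative sketch only: the boundary weighting below is an assumption,
# not the parametric loss family proposed in the paper.
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def sgd_train(X, y, n_classes, boundary_weighted=False, lr=0.1, epochs=20, rng=None):
    """Train a linear softmax classifier by stochastic gradient descent."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(n):
            p = softmax(X[i] @ W + b)            # posterior probability estimates
            grad = p.copy()
            grad[y[i]] -= 1.0                    # gradient of cross entropy w.r.t. logits
            if boundary_weighted:
                # Hypothetical emphasis on samples near the decision boundary:
                # weight by how small the gap between the two largest posteriors is.
                top2 = np.sort(p)[-2:]
                grad *= 1.0 - (top2[1] - top2[0])
            W -= lr * np.outer(X[i], grad)
            b -= lr * grad
    return W, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three Gaussian blobs as a toy multiclass problem
    X = np.vstack([rng.normal(c, 1.0, size=(100, 2)) for c in ([0, 0], [3, 0], [0, 3])])
    y = np.repeat([0, 1, 2], 100)
    for flag in (False, True):
        W, b = sgd_train(X, y, 3, boundary_weighted=flag, rng=np.random.default_rng(1))
        acc = (softmax(X @ W + b).argmax(1) == y).mean()
        print(f"boundary_weighted={flag}: train accuracy {acc:.2f}")
```

In this toy setting both variants reach similar accuracy; the point of the weighting is only to show how gradient updates can concentrate on samples near the decision regions, which is the qualitative behaviour the abstract attributes to the proposed losses.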