Any reasonable cost function can be used for a posteriori probability approximation

  • Authors:
  • M. Saerens, P. Latinne, C. Decaestecker

  • Affiliations:
  • IRIDIA Lab., Université Libre de Bruxelles, Brussels

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2002


Abstract

In this paper, we provide a straightforward proof of an important, but nevertheless little known, result obtained by Lindley in the framework of subjective probability theory. This result, once interpreted in the machine learning/pattern recognition context, sheds new light on the probabilistic interpretation of the output of a trained classifier. A learning machine, or more generally a model, is usually trained by minimizing a criterion (the expectation of a cost function) measuring the discrepancy between the model output and the desired output. We first show that, for the binary classification case, training the model with any "reasonable cost function" can lead to Bayesian a posteriori probability estimation: after the model has been trained by minimizing the criterion, there always exists a computable transformation that maps its output to the Bayesian a posteriori probability of class membership given the input. Necessary conditions allowing the computation of this transformation are then derived for the multioutput case. Finally, these theoretical results are illustrated through simulation examples involving various cost functions.
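As a minimal sketch of the binary-case idea (not the paper's own experiments or code), the example below assumes one particular "reasonable" cost function, the exponential cost E[exp(-y g(x))] with labels y in {-1, +1}, together with synthetic Gaussian data. For that cost, the pointwise minimizer is g*(x) = 0.5 ln(P(y=+1|x) / P(y=-1|x)), so the computable transformation p(x) = 1 / (1 + exp(-2 g(x))) maps the trained output back to the a posteriori probability. The data distribution, the linear score model, and the plain gradient-descent training loop are all illustrative assumptions.

```python
# Hedged illustration: train a linear score g(x) = w*x + b by minimizing the
# empirical exponential cost, then map the output to an estimated posterior
# via p = 1 / (1 + exp(-2 g(x))) and compare with the true Bayes posterior.
import numpy as np

rng = np.random.default_rng(0)

# Two equal-variance Gaussian classes with equal priors; the true posterior is
# then a sigmoid in x, so a linear score is well specified for this toy problem.
n = 5000
y = rng.choice([-1, 1], size=n)             # labels in {-1, +1}
x = rng.normal(loc=1.0 * y, scale=1.0)      # class means at -1 and +1

def exp_cost_grad(w, b):
    """Gradient of the empirical exponential cost (one 'reasonable' cost function)."""
    g = w * x + b
    e = np.exp(-y * g)
    return np.mean(-y * x * e), np.mean(-y * e)

w, b = 0.0, 0.0
for _ in range(2000):                        # plain gradient descent
    gw, gb = exp_cost_grad(w, b)
    w -= 0.1 * gw
    b -= 0.1 * gb

# Transformation from the trained output to the estimated posterior P(y=+1 | x).
xs = np.linspace(-4.0, 4.0, 9)
p_hat = 1.0 / (1.0 + np.exp(-2.0 * (w * xs + b)))

# True Bayes posterior for this synthetic problem (equal priors, unit variance).
p_true = 1.0 / (1.0 + np.exp(-2.0 * xs))

print(np.round(np.c_[xs, p_hat, p_true], 3))
```

With squared error the trained output would approximate the posterior directly; the exponential cost is used here only to make the role of the nontrivial transformation visible.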