Efficient Minimisation of the KL Distance for the Approximation of Posterior Conditional Probabilities

  • Authors:
  • M. Battisti; P. Burrascano; D. Pirollo

  • Affiliations:
  • Istituto di Elettronica, Università di Perugia, via G. Duranti 1/A-1, I 06143 Perugia, Italy. E-mail: burrascano@istel.ing.unipg.it

  • Venue:
  • Neural Processing Letters
  • Year:
  • 1997


Abstract

The minimisation of a least mean squares cost function produces poor results in the ranges of the input variable where the quantity to be approximated takes on relatively low values. This can be a problem if an accurate approximation is required in a wide dynamic range. The present paper approaches this problem in the case of multilayer perceptrons trained to approximate the posterior conditional probabilities in a multicategory classification problem. The use of a cost function derived from the Kullback–Leibler information distance measure is proposed, and a computationally light algorithm is derived for its minimisation. The effectiveness of the procedure is experimentally verified.
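The abstract's central observation can be illustrated with a small numeric sketch. The code below is not the paper's algorithm; it merely compares a least-mean-squares cost against a Kullback–Leibler cost on a hypothetical three-class posterior vector, where the numbers are invented for illustration:

```python
import numpy as np

def kl_cost(t, y, eps=1e-12):
    """KL(t || y) = sum_i t_i * log(t_i / y_i), with 0*log(0) taken as 0."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    mask = t > 0
    return float(np.sum(t[mask] * np.log(t[mask] / (y[mask] + eps))))

def mse_cost(t, y):
    """Least-mean-squares cost between target and output vectors."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    return float(np.mean((t - y) ** 2))

# Hypothetical target posterior with one low-probability class.
t = [0.9, 0.09, 0.01]
# A network output that is accurate on the large probabilities but off by
# an order of magnitude on the smallest one.
y = [0.9, 0.099, 0.001]

# The squared-error cost barely registers the relative error on the small
# class, while the KL cost penalises it strongly.
print(mse_cost(t, y))
print(kl_cost(t, y))
```

With these numbers the squared-error cost is on the order of 1e-5 while the KL cost is several hundred times larger, which mirrors the abstract's point that an LMS criterion under-weights errors where the approximated quantity is small.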