Data Fusion with Entropic Priors

  • Authors:
  • Francesco Palmieri; Domenico Ciuonzo

  • Affiliations:
  • Dipartimento di Ingegneria dell'Informazione, Seconda Università di Napoli, Italy (both authors)

  • Venue:
  • Neural Nets WIRN10: Proceedings of the 20th Italian Workshop on Neural Nets
  • Year:
  • 2011


Abstract

In classification problems, lack of knowledge of the prior distribution may make the direct application of Bayes' rule inadequate. Uniform or arbitrary priors often yield classification answers that, even in simple examples, contradict our common sense about the problem. Entropic priors, obtained via the maximum entropy principle, appear to provide a much better answer and can be easily derived and applied to classification tasks when nothing more than the likelihood functions is available. In this paper we present an example in which classification with entropic priors is compared to the results obtained with Dempster-Shafer theory.
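The following is a minimal illustrative sketch, not taken from the paper. It assumes the standard maximum-entropy derivation in which, with the likelihoods held fixed, maximizing the joint entropy H(X, C) over the class priors gives a prior for class c_i proportional to exp(H(X | c_i)); the likelihood values and observation alphabet below are hypothetical.

```python
import numpy as np

# Hypothetical discrete likelihoods p(x | c_i) for two classes over a small
# observation alphabet (illustrative numbers only).
likelihoods = np.array([
    [0.70, 0.20, 0.10],   # class c_0: peaked likelihood -> low entropy
    [0.34, 0.33, 0.33],   # class c_1: nearly uniform likelihood -> high entropy
])

def conditional_entropy(p):
    """Shannon entropy H(X | c) of one likelihood row, in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Entropic prior: pi_i proportional to exp(H(X | c_i)), i.e. classes whose
# likelihoods are less informative receive larger prior mass.
H = np.array([conditional_entropy(row) for row in likelihoods])
prior = np.exp(H) / np.sum(np.exp(H))

# Bayes' rule with the entropic prior for an observed symbol x.
x = 2
posterior = prior * likelihoods[:, x]
posterior /= posterior.sum()

print("entropic prior:", prior)
print(f"posterior given x = {x}:", posterior)
```

Under these assumptions, the class with the flatter (higher-entropy) likelihood gets the larger prior, and the posterior then follows from the usual Bayes update once an observation is available.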