Optimization of Divergences within the Exponential Family for Image Segmentation

  • Authors:
  • Francois Lecellier; Stephanie Jehan-Besson; Jalal Fadili; Gilles Aubert; Marinette Revenu

  • Affiliations:
  • Laboratoire GREYC, University of Caen, France; Laboratoire LIMOS, University of Clermont-Ferrand, France; Laboratoire GREYC, University of Caen, France; Laboratoire J.A. Dieudonné, University of Nice Sophia-Antipolis, France; Laboratoire GREYC, University of Caen, France

  • Venue:
  • SSVM '09 Proceedings of the Second International Conference on Scale Space and Variational Methods in Computer Vision
  • Year:
  • 2009

Abstract

In this work, we propose novel results for the optimization of divergences within the framework of region-based active contours. We focus on parametric statistical models where the region descriptor is chosen as the probability density function (pdf) of an image feature (e.g. intensity) inside the region and the pdf belongs to the exponential family. The optimization of divergences appears as a flexible tool for segmentation with and without an intensity prior. For segmentation without a reference, we aim at maximizing the discrepancy between the pdf of the inside region and the pdf of the outside region. Moreover, since the optimization is performed within the exponential family, we can cope with difficult segmentation problems involving various noise models (Gaussian, Rayleigh, Poisson, Bernoulli, etc.). We also show experimentally that the maximization of the KL divergence offers interesting properties compared to some other data terms (e.g. minimization of the anti-log-likelihood). Experimental results on medical images (brain MRI, contrast echocardiography) confirm the applicability of this general setting.
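As a minimal illustration of the data term described above (not the authors' implementation; the function names below are hypothetical), the following Python sketch computes the closed-form KL divergence between intensity pdfs estimated inside and outside a binary segmentation mask, assuming Gaussian noise, i.e. one member of the exponential family. In the paper's setting, such a discrepancy would be maximized by evolving the region-based active contour.

```python
import numpy as np

def gaussian_kl(mu_p, var_p, mu_q, var_q):
    """Closed-form KL(p || q) for two univariate Gaussians,
    a simple member of the exponential family."""
    return 0.5 * (np.log(var_q / var_p)
                  + (var_p + (mu_p - mu_q) ** 2) / var_q
                  - 1.0)

def region_kl(image, mask):
    """KL divergence between intensity pdfs estimated inside and
    outside a binary region mask (Gaussian noise assumption)."""
    inside, outside = image[mask], image[~mask]
    return gaussian_kl(inside.mean(), inside.var(),
                       outside.mean(), outside.var())

# Toy usage: a bright disk on a darker, noisy background.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.normal(0.2, 0.05, (128, 128))
    yy, xx = np.mgrid[:128, :128]
    disk = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
    img[disk] += 0.5
    print(region_kl(img, disk))  # large value: regions are well separated
```

For other exponential-family noise models (Rayleigh, Poisson, Bernoulli, ...), the same structure applies with the corresponding closed-form KL divergence between natural parameters in place of `gaussian_kl`.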