Texture guided active appearance model propagation for prostate segmentation

  • Authors:
  • Soumya Ghose; Arnau Oliver; Robert Martí; Xavier Lladó; Jordi Freixenet; Joan C. Vilanova; Fabrice Meriaudeau

  • Affiliations:
  • Computer Vision and Robotics Group, University of Girona, Girona, Spain and Laboratoire Le2I, UMR CNRS, Université de Bourgogne, Le Creusot, France; Computer Vision and Robotics Group, University of Girona, Girona, Spain; Computer Vision and Robotics Group, University of Girona, Girona, Spain; Computer Vision and Robotics Group, University of Girona, Girona, Spain; Computer Vision and Robotics Group, University of Girona, Girona, Spain; Girona Magnetic Resonance Imaging Center, Girona, Spain; Laboratoire Le2I, UMR CNRS, Université de Bourgogne, Le Creusot, France

  • Venue:
  • MICCAI'10: Proceedings of the 2010 International Workshop on Prostate Cancer Imaging: Computer-Aided Diagnosis, Prognosis, and Intervention
  • Year:
  • 2010

Abstract

Fusion of Magnetic Resonance Imaging (MRI) and Transrectal Ultrasound (TRUS) images during TRUS-guided prostate biopsy improves localization of malignant tissues. Segmented prostate contours in TRUS and MRI improve registration accuracy and reduce the computational cost of the procedure. However, accurate segmentation of the prostate in TRUS images is challenging due to the low signal-to-noise ratio, heterogeneous intensity distribution inside the prostate, and imaging artifacts such as speckle noise and shadow. We propose to use texture features from the approximation coefficients of the Haar wavelet transform to propagate a shape- and appearance-based statistical model for segmenting the prostate in a multi-resolution framework. A parametric model of the propagating contour is derived from Principal Component Analysis of prior shape and texture information of the prostate in the training data. The parameters are then modified with prior knowledge of the optimization space to achieve optimal prostate segmentation. The proposed method achieves a mean Dice Similarity Coefficient of 0.95 ± 0.01 and a mean segmentation time of 0.72 ± 0.05 seconds when validated on 25 TRUS images, extracted from video sequences, in a leave-one-out validation framework. The proposed model performs computationally efficient, accurate prostate segmentation in the presence of intensity heterogeneity and imaging artifacts.
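The sketch below is not the authors' implementation; it is a minimal illustration of the building blocks named in the abstract: Haar wavelet approximation coefficients as texture features, a PCA model of concatenated shape and texture vectors, and the Dice Similarity Coefficient used for validation. All function names, the decomposition depth, and the retained-variance threshold are assumptions introduced for illustration.

```python
# Hypothetical sketch of the components described in the abstract.
# Not the authors' code; names and parameters are illustrative only.

import numpy as np
import pywt                              # Haar wavelet transform
from sklearn.decomposition import PCA    # statistical shape/texture model


def haar_approximation(image, levels=2):
    """Texture features: approximation coefficients of a multi-level
    2-D Haar wavelet decomposition (a multi-resolution representation)."""
    coeffs = np.asarray(image, dtype=float)
    for _ in range(levels):
        coeffs, _ = pywt.dwt2(coeffs, 'haar')   # keep only the approximation band
    return coeffs.ravel()


def build_shape_texture_model(shapes, textures, variance=0.98):
    """PCA of concatenated shape and texture vectors from the training set,
    giving a compact parametric model of the propagating contour."""
    shapes = np.asarray(shapes, dtype=float)      # (n_samples, 2 * n_landmarks)
    textures = np.asarray(textures, dtype=float)  # (n_samples, n_texture_features)
    combined = np.hstack([shapes, textures])
    model = PCA(n_components=variance)            # retain e.g. 98% of the variance
    model.fit(combined)
    return model


def dice_coefficient(seg, ref):
    """Dice Similarity Coefficient between two binary masks."""
    seg, ref = np.asarray(seg, dtype=bool), np.asarray(ref, dtype=bool)
    return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())
```

In such a setup, segmentation quality against a manual reference contour would be reported with `dice_coefficient`, matching the metric quoted in the abstract.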