Cell tracking and segmentation in electron microscopy images using graph cuts

  • Authors:
  • Huei-Fang Yang; Yoonsuck Choe

  • Affiliations:
  • Department of Computer Science and Engineering, Texas A&M University, College Station, TX

  • Venue:
  • ISBI '09: Proceedings of the 6th IEEE International Symposium on Biomedical Imaging: From Nano to Macro
  • Year:
  • 2009

Abstract

Understanding neural connectivity and structures in the brain requires detailed 3D anatomical models, and such an understanding is essential to the study of the nervous system. However, the reconstruction of 3D models from a large set of dense nanoscale images is very challenging, due to imperfections in staining and noise in the imaging process. Manual segmentation in 2D followed by tracking the 2D contours through the cross-sections to build 3D structures can be a solution, but it is impractical for large image stacks. In this paper, we propose an automated tracking and segmentation framework to extract 2D contours and to trace them along the z direction. The segmentation is posed as an energy minimization problem and solved via graph cuts. The energy function to be minimized contains a regional term and a boundary term. The regional term is defined over the flux of the gradient vector fields and a distance function. Our main idea is that the distance function should carry the segmentation information from the previous image, based on the assumption that successive images have similar segmentations. The boundary term is defined over the gray-scale intensity of the image. Experiments were conducted on nanoscale image sequences acquired with a Serial Block-Face Scanning Electron Microscope (SBF-SEM). The results show that our method can successfully track and segment densely packed cells in EM image stacks.
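To make the two-term energy concrete: a standard two-label graph-cut formulation of this kind is E(L) = Σ_p R_p(L_p) + λ Σ_{(p,q)∈N} B_{p,q}·[L_p ≠ L_q], where R_p is the regional cost of labeling pixel p and B_{p,q} penalizes label discontinuities between neighboring pixels. The sketch below shows one possible tracking-by-segmentation step in this spirit: the regional term mixes a signed distance to the previous slice's mask with a simple flux-like term, and the boundary term falls off near strong intensity edges. It is not the authors' implementation; the library choice (PyMaxflow, NumPy, SciPy), the function name segment_slice, and all parameters (sigma, lam, alpha) are illustrative assumptions.

```python
# Minimal sketch of one graph-cut tracking/segmentation step.
# NOT the paper's implementation; weighting choices and parameters are assumptions.

import numpy as np
import maxflow                    # PyMaxflow: min-cut/max-flow on grid graphs
from scipy import ndimage

def segment_slice(img, prev_mask, sigma=10.0, lam=1.0, alpha=0.5):
    """Segment one EM slice given the binary mask of the previous slice.

    img       : 2D float array of gray-scale intensities
    prev_mask : 2D bool array, segmentation of the previous slice
    sigma     : intensity scale of the boundary term (assumed)
    lam       : weight of the boundary term (assumed)
    alpha     : mix between distance and flux terms (assumed)
    """
    # --- regional term ------------------------------------------------------
    # Signed distance from the previous contour: negative inside the previous
    # mask, positive outside; this is how the prior slice carries over.
    d_in = ndimage.distance_transform_edt(prev_mask)
    d_out = ndimage.distance_transform_edt(~prev_mask)
    signed_dist = d_out - d_in

    # A simple flux-like quantity: divergence of the normalized image gradient.
    gy, gx = np.gradient(ndimage.gaussian_filter(img, 2.0))
    mag = np.hypot(gx, gy) + 1e-8
    flux = np.gradient(gx / mag, axis=1) + np.gradient(gy / mag, axis=0)

    regional = alpha * np.tanh(signed_dist) + (1.0 - alpha) * flux
    src_cap = np.maximum(regional, 0)    # penalty for labeling a pixel "cell"
    snk_cap = np.maximum(-regional, 0)   # penalty for labeling it "background"

    # --- boundary term ------------------------------------------------------
    # Smoothness weights that drop near strong intensity edges (approximated
    # per pixel rather than per edge, which is enough for a sketch).
    grad_mag = np.hypot(*np.gradient(img))
    boundary = lam * np.exp(-(grad_mag ** 2) / (2.0 * sigma ** 2))

    # --- graph construction and min-cut -------------------------------------
    g = maxflow.Graph[float]()
    nodeids = g.add_grid_nodes(img.shape)
    g.add_grid_edges(nodeids, weights=boundary, symmetric=True)  # 4-neighborhood
    g.add_grid_tedges(nodeids, src_cap, snk_cap)
    g.maxflow()

    # Pixels on the sink side of the cut are taken as "cell" here.
    return g.get_grid_segments(nodeids)
```

Applied slice by slice, the output mask of one call would serve as prev_mask for the next, which is the sense in which the distance function propagates the segmentation through the stack.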