3D automated nuclear morphometric analysis using active meshes
PRIB'07 Proceedings of the 2nd IAPR international conference on Pattern recognition in bioinformatics
Deformable mesh methods have become an alternative of choice to classical deformable models for 3D image understanding. They allow the evolving surface to be rendered directly and efficiently during the segmentation process, avoiding both the additional time cost and the approximation errors induced by 3D reconstruction algorithms applied after segmentation. Current methods rely on edge-based forces to attract the mesh surface toward image features. Such forces are inadequate in 3D fluorescence microscopy, where edges are poorly defined by the intensity gradient. In this paper, we propose a fully automated deformable 3D mesh model that evolves under the reduced Mumford-Shah functional to segment and track objects with fuzzy boundaries. Rendering the mesh evolution simultaneously allows faster tuning of the model parameters and offers biologists more precise insight into the scene, and hence a better understanding of biological phenomena. We present evaluations on both synthetic and real 3D microscopy data.
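To illustrate the region-based idea behind the reduced (piecewise-constant) Mumford-Shah functional, the sketch below computes a Chan-Vese-style speed field from the two region means. This is a minimal NumPy illustration under assumed conventions, not the authors' implementation: the function name `region_speed` and the toy volume are hypothetical, and a real active mesh would sample this speed at mesh vertices rather than over the whole grid.

```python
import numpy as np

def region_speed(image, inside_mask):
    """Piecewise-constant Mumford-Shah (Chan-Vese-style) speed:
    positive where the voxel intensity is closer to the interior
    mean (surface should expand), negative where it is closer to
    the exterior mean (surface should contract). No edge gradient
    is used, which is why this suits fuzzy boundaries.
    """
    c_in = image[inside_mask].mean()    # mean intensity inside the surface
    c_out = image[~inside_mask].mean()  # mean intensity outside
    return (image - c_out) ** 2 - (image - c_in) ** 2

# Toy 3D volume: a bright cube on a dark, noisy background,
# mimicking a fluorescent object with a fuzzy boundary.
rng = np.random.default_rng(0)
vol = rng.normal(0.1, 0.05, (20, 20, 20))
vol[5:15, 5:15, 5:15] += 0.8

mask = np.zeros_like(vol, dtype=bool)
mask[8:12, 8:12, 8:12] = True           # small initial interior

speed = region_speed(vol, mask)
# Expected behavior: expand inside the bright object,
# contract in the dark background.
print(speed[10, 10, 10] > 0, speed[2, 2, 2] < 0)
```

In a full scheme, each vertex would be displaced along its normal proportionally to this speed (plus an internal regularization term), and the region means would be recomputed as the mesh evolves.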