Model-based registration for motion compensation during EP ablation procedures
WBIR'10 Proceedings of the 4th international conference on Biomedical image registration
MICCAI'10 Proceedings of the 13th international conference on Medical image computing and computer-assisted intervention: Part I
Combined cardiac and respiratory motion compensation for atrial fibrillation ablation procedures
MICCAI'11 Proceedings of the 14th international conference on Medical image computing and computer-assisted intervention - Volume Part I
Learning-based hypothesis fusion for robust catheter tracking in 2D X-ray fluoroscopy
CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
Catheter ablation is widely accepted as the best remaining option for the treatment of atrial fibrillation if drug therapy fails. Ablation procedures can be guided by 3-D overlay images projected onto live fluoroscopic X-ray images. These overlay images are generated from MR, CT, or C-arm CT volumes. As the alignment of the overlay is often compromised by cardiac and respiratory motion, motion compensation methods are desirable. The most recent and promising approaches use either a catheter in the coronary sinus or a circumferential mapping catheter placed at the ostium of one of the pulmonary veins. As both methods suffer from different problems, we propose a novel method to achieve motion compensation for fluoroscopy-guided cardiac ablation procedures. Our new method localizes the coronary sinus catheter. Based on this information, we estimate the position of the circumferential mapping catheter. As the mapping catheter is placed at the site of ablation, it provides a good surrogate for respiratory and cardiac motion. To correlate the motion of both catheters, our method includes a training phase in which both catheters are tracked together. The training information is then used to estimate the cardiac and respiratory motion of the left atrium by observing the coronary sinus catheter only. The approach yields an average 2-D estimation error of 1.99 ± 1.20 mm.
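The two-phase scheme described above (track both catheters during training, then predict the mapping catheter from the coronary sinus catheter alone) can be sketched with a simple affine regression. This is a hypothetical illustration, not the authors' actual model: the abstract does not specify how the catheter motions are correlated, and all positions below are synthetic.

```python
import numpy as np

# Hypothetical training data: 2-D image positions (in mm) of the coronary
# sinus (CS) catheter and the circumferential mapping catheter, tracked
# together over several fluoroscopy frames. Numbers are synthetic.
rng = np.random.default_rng(0)
cs_train = rng.uniform(0, 100, size=(50, 2))        # CS catheter positions
true_map = np.array([[0.9, 0.1], [-0.2, 1.1]])      # assumed linear relation
offset = np.array([5.0, -3.0])
map_train = cs_train @ true_map.T + offset
map_train += rng.normal(0, 0.5, map_train.shape)    # simulated tracking noise

# Training phase: fit an affine model mapping CS position to mapping-catheter
# position by linear least squares (one simple way to correlate the motion
# of the two catheters).
A = np.hstack([cs_train, np.ones((len(cs_train), 1))])  # add bias column
coeffs, *_ = np.linalg.lstsq(A, map_train, rcond=None)

# Application phase: estimate the mapping-catheter position (and hence the
# motion of the left atrium) from the CS catheter only.
cs_test = rng.uniform(0, 100, size=(20, 2))
map_true = cs_test @ true_map.T + offset
A_test = np.hstack([cs_test, np.ones((len(cs_test), 1))])
map_est = A_test @ coeffs

# 2-D estimation error: Euclidean distance per frame, the same error
# measure the abstract reports (1.99 +/- 1.20 mm for the real method).
err = np.linalg.norm(map_est - map_true, axis=1)
print(f"mean 2-D estimation error: {err.mean():.2f} mm")
```

The affine model stands in for whatever learned correlation the paper uses; in practice the training pairs would come from simultaneously tracking both catheters in the fluoroscopic sequence rather than from synthetic data.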