An automated image-guided robot arm system for small animal tissue biopsy studies

  • Authors:
  • Y. H. Huang; T. H. Wu; M. H. Lin; C. C. Yang; T. C. Wang; C. L. Chen; W. Y. Guo; Jason J. S. Lee

  • Affiliations:
  • Institute of Radiological Sciences, National Yang-Ming University, Taipei, Taiwan (all authors except T. H. Wu); Department of Medical Imaging Technology, Chung Shan Medical University, Taichung, Taiwan (T. H. Wu); Department of Radiology, Taipei Veterans General Hospital, Taipei, Taiwan (W. Y. Guo)

  • Venue:
  • BioMed'06: Proceedings of the 24th IASTED International Conference on Biomedical Engineering
  • Year:
  • 2006

Abstract

The ability to non-invasively monitor cell biology in vivo is one of the most important goals of molecular imaging. Because imaging procedures can be performed repeatedly on the same subject at different stages of an investigation, small animals need not be sacrificed during the study period. The goal of this study was therefore to design a stereotactic image-guided system for small animals and to integrate it with an automatic robot arm for in vivo tissue biopsy analysis. The system was composed of three main parts: a small-animal stereotactic frame, semi-automatic image-fusion software, and an automatic robot arm. Each component was evaluated: the robot positioning accuracy was 0.05 ± 0.02 mm, the image registration accuracy was 0.37 ± 0.18 mm, and the overall system integration error was within 1.20 ± 0.39 mm. These results show that the system is sufficiently accurate to guide the micro-injector mounted on the robot arm along the planned delivery routes. The overall system accuracy was limited by the image fusion and orientation procedures, owing to the blurred nature of PET images acquired from small objects. The primary improvement, and the goal of our next study, is to acquire the highest-resolution fused images possible for localizing the targets.
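
The abstract does not describe how the semi-automatic image-fusion software aligns the imaging modalities. The sketch below is a minimal, hedged illustration of one common approach to stereotactic-frame registration: fiducial-based rigid alignment via the Kabsch algorithm. It is not taken from the paper, and all marker coordinates, function names, and the noise model are hypothetical.

```python
import numpy as np


def rigid_register(fixed, moving):
    """Estimate a rotation R and translation t mapping `moving` onto `fixed`.

    fixed, moving: (N, 3) arrays of corresponding fiducial coordinates in mm.
    Returns (R, t) such that fixed ≈ moving @ R.T + t (Kabsch algorithm).
    """
    fixed = np.asarray(fixed, dtype=float)
    moving = np.asarray(moving, dtype=float)

    # Centre both point sets on their centroids.
    c_fixed, c_moving = fixed.mean(axis=0), moving.mean(axis=0)
    A, B = moving - c_moving, fixed - c_fixed

    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, _, Vt = np.linalg.svd(A.T @ B)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflections
    t = c_fixed - R @ c_moving
    return R, t


def registration_error(fixed, moving, R, t):
    """Root-mean-square residual (mm) after applying the estimated transform."""
    mapped = moving @ R.T + t
    return float(np.sqrt(np.mean(np.sum((mapped - fixed) ** 2, axis=1))))


if __name__ == "__main__":
    # Hypothetical frame-marker coordinates (mm) in the reference image space.
    ct_markers = np.array([[0.0, 0, 0], [40, 0, 0], [0, 40, 0], [0, 0, 40]])

    # Simulate the same markers in a second modality: rotated 30 degrees
    # about the z-axis, shifted, and perturbed by localisation noise.
    theta = np.deg2rad(30.0)
    true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
    rng = np.random.default_rng(0)
    pet_markers = ct_markers @ true_R.T + np.array([5.0, -3.0, 2.0])
    pet_markers += rng.normal(scale=0.1, size=pet_markers.shape)

    R, t = rigid_register(ct_markers, pet_markers)
    err = registration_error(ct_markers, pet_markers, R, t)
    print(f"residual registration error: {err:.3f} mm")
```

In a fiducial-based scheme like this, the residual reported by `registration_error` plays the role of the image registration accuracy quoted in the abstract; how the authors actually computed their 0.37 ± 0.18 mm figure is not stated.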