Towards an ultrasound probe with vision: structured light to determine surface orientation

  • Authors:
  • Samantha Horvath; John Galeotti; Bo Wang; Matt Perich; Jihang Wang; Mel Siegel; Patrick Vescovi; George Stetten

  • Affiliations:
  • Department of Bioengineering, University of Pittsburgh, USA and Robotics Institute, Carnegie Mellon University; Robotics Institute, Carnegie Mellon University; Department of Bioengineering, University of Pittsburgh; Department of Bioengineering, University of Pittsburgh; Department of Biomedical Engineering, Carnegie Mellon University; Robotics Institute, Carnegie Mellon University; Department of Bioengineering, University of Pittsburgh; Department of Bioengineering, University of Pittsburgh, USA, Robotics Institute, Carnegie Mellon University, USA, and Department of Biomedical Engineering, Carnegie Mellon University

  • Venue:
  • AE-CAI'11 Proceedings of the 6th international conference on Augmented Environments for Computer-Assisted Interventions
  • Year:
  • 2011


Abstract

Over the past decade, we have developed an augmented reality system called the Sonic Flashlight (SF), which merges ultrasound with the operator's vision using a half-silvered mirror and a miniature display attached to the ultrasound probe. We now add a small video camera and a structured laser light source so that computer vision algorithms can determine the location of the surface of the patient being scanned, to aid in analysis of the ultrasound data. In particular, we intend to determine the angle of the ultrasound probe relative to the surface to disambiguate Doppler information from arteries and veins running parallel to, and beneath, that surface. The initial demonstration presented here finds the orientation of a flat-surfaced ultrasound phantom. This is a first step towards integrating more sophisticated computer vision methods into automated ultrasound analysis, with the ultimate goal of creating a symbiotic human/machine system that shares both ultrasound and visual data.
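The core geometric step the abstract describes, recovering the surface orientation and the probe's angle relative to it, can be sketched as a plane fit followed by an angle computation. The sketch below is illustrative only, under the assumption that the structured-light system has already triangulated a set of 3D surface points in the probe/camera frame; the function names and the choice of an SVD-based least-squares plane fit are this sketch's assumptions, not details from the paper.

```python
import numpy as np

def fit_plane_normal(points):
    # Least-squares plane fit: center the points, then take the singular
    # vector with the smallest singular value (the direction of least
    # variance), which is the plane's unit normal.
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

def probe_angle_deg(probe_axis, surface_normal):
    # Angle (degrees) between the probe axis and the surface normal;
    # abs() makes the result independent of the normal's sign.
    a = np.asarray(probe_axis, dtype=float)
    a = a / np.linalg.norm(a)
    c = abs(np.dot(a, surface_normal))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Synthetic example: points on the tilted plane z = 0.5 * x, standing in
# for triangulated laser-stripe points on a flat ultrasound phantom.
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
pts = np.column_stack([xs.ravel(), ys.ravel(), 0.5 * xs.ravel()])
n = fit_plane_normal(pts)
angle = probe_angle_deg([0, 0, 1], n)  # probe held along the z axis
```

For this synthetic plane the recovered tilt equals arctan(0.5) ≈ 26.6 degrees, which is the kind of probe-to-surface angle the authors need in order to disambiguate Doppler signals from vessels running parallel to the surface.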