Automatic classification of digestive organs in wireless capsule endoscopy videos

  • Authors:
  • Jeongkyu Lee;JungHwan Oh;Subodh Kumar Shah;Xiaohui Yuan;Shou Jiang Tang

  • Affiliations:
  • University of Bridgeport, Bridgeport, CT;University of North Texas, Denton, TX;University of Bridgeport, Bridgeport, CT;University of North Texas, Denton, TX;UTSW Medical Center, Dallas, TX

  • Venue:
  • Proceedings of the 2007 ACM symposium on Applied computing
  • Year:
  • 2007

Abstract

Wireless Capsule Endoscopy (WCE) allows a physician to examine the entire small intestine without any surgical operation. With the miniaturization of wireless and camera technologies comes the ability to view the entire gastrointestinal tract with little effort. Although WCE is a technical breakthrough that allows us to access the entire intestine without surgery, it is reported that a clinician spends one to two hours assessing a WCE video, which limits the number of examinations possible and incurs considerable cost. To reduce the assessment time, it is critical to develop a technique to automatically discriminate digestive organs such as the esophagus, stomach, small intestine (i.e., duodenum, jejunum, and ileum), and colon. In this paper, we propose a novel technique to segment a WCE video into these anatomic parts based on color change pattern analysis. The basic idea is that each digestive organ exhibits different patterns of intestinal contractions, which are quantified as features. We present experimental results that demonstrate the effectiveness of the proposed method.
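The abstract does not include implementation details, so the following is only a minimal sketch of the kind of color-change analysis it describes, assuming per-frame mean hue as the color feature; the function names, window size, and threshold are hypothetical and not taken from the paper.

```python
import numpy as np

def mean_hue_per_frame(frames):
    """Average hue of each frame.

    frames: iterable of H x W x 3 arrays in HSV color space.
    Returns a 1-D array with one color feature per frame.
    """
    return np.array([f[..., 0].mean() for f in frames])

def detect_organ_boundaries(hue_series, window=50, threshold=15.0):
    """Flag frame indices where the windowed mean hue shifts sharply,
    a rough proxy for transitions between digestive organs
    (parameters are illustrative, not from the paper)."""
    boundaries = []
    for i in range(window, len(hue_series) - window):
        before = hue_series[i - window:i].mean()
        after = hue_series[i:i + window].mean()
        if abs(after - before) > threshold:
            # keep only the first frame of a run of detections
            if not boundaries or i - boundaries[-1] > window:
                boundaries.append(i)
    return boundaries
```

As a usage example, feeding the per-frame hue series of a capsule video into detect_organ_boundaries would yield candidate frames at which the dominant mucosal color changes, which could then be mapped to organ transitions; the paper's actual method additionally exploits contraction patterns, which this sketch does not model.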