Fully automatic and segmentation-robust classification of breast tumors based on local texture analysis of ultrasound images

  • Authors:
  • Bo Liu; H. D. Cheng; Jianhua Huang; Jiawei Tian; Xianglong Tang; Jiafeng Liu

  • Affiliations:
  • School of Computer Science and Technology, Harbin Institute of Technology, No. 92, Xidazhi Street, Harbin 150001, PR China; School of Computer Science and Technology, Harbin Institute of Technology, No. 92, Xidazhi Street, Harbin 150001, PR China and Department of Computer Science, Utah State University, Logan, ...; School of Computer Science and Technology, Harbin Institute of Technology, No. 92, Xidazhi Street, Harbin 150001, PR China; Second Affiliated Hospital of Harbin Medical University, Harbin, PR China; School of Computer Science and Technology, Harbin Institute of Technology, No. 92, Xidazhi Street, Harbin 150001, PR China; School of Computer Science and Technology, Harbin Institute of Technology, No. 92, Xidazhi Street, Harbin 150001, PR China

  • Venue:
  • Pattern Recognition
  • Year:
  • 2010

Abstract

A region of interest (ROI) is the region from which features are extracted. In a breast ultrasound (BUS) image, the ROI is the breast tumor region. Because of poor image quality (low signal-to-noise ratio, low contrast, blurry boundaries, etc.), it is difficult to segment a BUS image accurately and produce an ROI that precisely covers the tumor region. Since feature extraction requires an accurate ROI, fully automatic classification of BUS images is a difficult task. In this paper, a novel fully automatic classification method for BUS images is proposed, consisting of two steps: an "ROI generation" step and an "ROI classification" step. The ROI generation step focuses on finding a credible ROI rather than the precise tumor location. The ROI classification step employs a novel feature extraction and classification strategy: points evenly distributed in the ROI are selected as "classification checkpoints", local texture features are extracted around each checkpoint, every checkpoint is classified, and the class of the BUS image is finally determined by analyzing the classification results of all checkpoints in the corresponding ROI. Both steps are implemented with a supervised texture classification approach. The experiments demonstrate that the proposed method is robust to imperfect segmentation of BUS images and is effective for classifying breast tumors.
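The checkpoint-based classification stage described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the grid step, window size, first-order texture statistics, SVM classifier, and majority-vote aggregation are all placeholders; the paper's actual local texture features and decision rule may differ.

```python
import numpy as np
from sklearn.svm import SVC

def checkpoint_grid(roi_bbox, step):
    """Place evenly spaced 'classification checkpoints' inside an ROI bounding box."""
    x0, y0, x1, y1 = roi_bbox
    xs = np.arange(x0 + step // 2, x1, step)
    ys = np.arange(y0 + step // 2, y1, step)
    return [(x, y) for y in ys for x in xs]

def local_texture_features(image, center, half_win=8):
    """Toy local texture descriptor around one checkpoint:
    first-order statistics of the surrounding window (illustrative only)."""
    x, y = center
    win = image[max(y - half_win, 0):y + half_win,
                max(x - half_win, 0):x + half_win].astype(np.float64)
    hist, _ = np.histogram(win, bins=16, range=(0, 255), density=True)
    p = hist[hist > 0]
    entropy = -np.sum(p * np.log2(p))
    return np.array([win.mean(), win.std(), entropy])

def classify_bus_image(image, roi_bbox, clf, step=16):
    """Classify every checkpoint in the ROI, then decide the image class by
    majority vote over the checkpoint predictions (one plausible aggregation rule)."""
    points = checkpoint_grid(roi_bbox, step)
    feats = np.stack([local_texture_features(image, p) for p in points])
    votes = clf.predict(feats)          # 0 = benign, 1 = malignant (assumed labels)
    return int(np.round(votes.mean()))  # majority vote over checkpoints

# Training (sketch): fit the checkpoint classifier on labelled checkpoints
# collected from training images, e.g.
# clf = SVC(kernel="rbf").fit(train_checkpoint_features, train_checkpoint_labels)
```

Aggregating per-checkpoint decisions, rather than classifying one global feature vector, is what makes the image-level decision tolerant of an ROI that only loosely covers the tumor.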