A classification-driven similarity matching framework for retrieval of biomedical images

  • Authors:
  • Md Mahmudur Rahman; Sameer K. Antani; George R. Thoma

  • Affiliations:
  • National Institutes of Health, Bethesda, MD, USA (all authors)

  • Venue:
  • Proceedings of the International Conference on Multimedia Information Retrieval
  • Year:
  • 2010

Abstract

This paper presents a classification-driven biomedical image retrieval system that bridges the semantic gap by mapping image features to global categories at different levels of granularity, such as image modality, body part, and orientation. To generate feature vectors at different levels of abstraction, both a visual concept feature based on a "bag of concepts" model composed of local color and texture patches and several low-level global color, edge, and texture features are extracted. Since it is difficult to find a single feature that compares images effectively for all types of queries, we use a similarity fusion approach based on a linear combination of the individual features. However, instead of the commonly used fixed or hard weighting, we rely on image classification to determine the importance of each feature at query time. For this, a supervised multi-class classifier based on the support vector machine (SVM) is trained on a set of sample images, and classifier combination techniques based on rules derived from Bayes' theorem are explored. After the classifiers' combined prediction of the query image category, the pre-computed weights of the individual features are adjusted in the similarity matching function for effective query-specific retrieval. Experiments are performed on a diverse medical image collection of 67,000 images of different modalities. The results demonstrate the effectiveness of the category-specific similarity fusion approach, which achieves a mean average precision (MAP) score of 0.0265, compared to using only a single feature or weighting all features equally in similarity matching.
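
The query-adaptive fusion described in the abstract amounts to a linear combination of per-feature similarity scores whose weights are chosen from pre-computed, category-specific tables according to the classifier's prediction for the query. The following is a minimal Python/NumPy sketch of that idea; the category names, weight values, and the soft (posterior-weighted) combination are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# Hypothetical per-category feature weights, pre-computed offline (e.g. from the
# per-category retrieval performance of each feature).  The categories stand in
# for the modality/body-part/orientation labels mentioned in the abstract, and
# the columns correspond to [concept, color, texture] feature channels.
CATEGORY_WEIGHTS = {
    "xray_chest": np.array([0.6, 0.1, 0.3]),
    "ct_abdomen": np.array([0.3, 0.2, 0.5]),
    "photo_skin": np.array([0.2, 0.6, 0.2]),
}
CATEGORIES = list(CATEGORY_WEIGHTS)

def fused_similarity(category_probs, feature_similarities):
    """Combine per-feature similarities with query-adaptive weights.

    category_probs       : posteriors P(category | query), one per entry in
                           CATEGORIES (e.g. from an SVM with probability
                           outputs, possibly after combining classifiers).
    feature_similarities : array of shape (n_images, n_features) holding the
                           similarity of each database image to the query
                           under each individual feature.
    Returns one fused similarity score per database image.
    """
    # Expected feature weights under the classifier's posterior over
    # categories, i.e. a soft query-specific weighting.
    weights = sum(p * CATEGORY_WEIGHTS[c]
                  for p, c in zip(category_probs, CATEGORIES))
    weights = weights / weights.sum()            # renormalize
    return feature_similarities @ weights        # linear combination per image

# Toy usage: 4 database images, 3 feature channels, and a classifier that is
# fairly confident the query is a chest X-ray.
sims = np.array([[0.9, 0.2, 0.4],
                 [0.1, 0.8, 0.3],
                 [0.5, 0.5, 0.5],
                 [0.7, 0.1, 0.9]])
probs = np.array([0.80, 0.15, 0.05])
print(fused_similarity(probs, sims))
```

A hard-weighting variant would simply take the weight row of the single most probable category; the soft version above degrades more gracefully when the classifier is uncertain about the query's category.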