Category sensitive codebook construction for object category recognition

  • Authors:
  • Chunjie Zhang;Jing Liu;Yi Ouyang;Qi Tian;Hanqing Lu;Songde Ma

  • Affiliations:
  • National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China (Chunjie Zhang, Jing Liu, Yi Ouyang, Hanqing Lu, Songde Ma); University of Texas at San Antonio, San Antonio, Texas (Qi Tian)

  • Venue:
  • ICIP'09: Proceedings of the 16th IEEE International Conference on Image Processing
  • Year:
  • 2009

Abstract

Recently, the bag-of-words (BOW) image representation has become popular for object category recognition. Because the codebook in a BOW representation is typically constructed by measuring only the visual similarity of local image features (e.g., with k-means), the resulting codebooks may fail to capture the information needed for object category recognition. This paper proposes a novel optimization method for discriminative codebook construction that adds the category information of local image features as an extra term to traditional visual-similarity-only codebook construction. The category-sensitive codebook is obtained by solving an optimization problem, and thus goes one step beyond visual-similarity-only methods. Moreover, the proposed method can be implemented on top of k-means clustering efficiently and effectively. Experimental results on the PASCAL VOC Challenge 2006 data set demonstrate the effectiveness of our method.
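The abstract describes augmenting a visual-similarity-only k-means objective with a category term. The paper's exact objective is not given here, so the following is only a minimal illustrative sketch: a k-means variant in which each visual word maintains a category distribution, and the assignment cost adds a penalty (weighted by an assumed parameter `lam`) for assigning a feature to a word rarely used by its own category. The function name and the specific penalty form are this sketch's assumptions, not the authors' formulation.

```python
import numpy as np

def category_sensitive_kmeans(X, labels, k, lam=1.0, iters=20, seed=0):
    """Illustrative category-sensitive codebook construction.

    X      : (n, d) array of local feature descriptors
    labels : (n,) integer category label per feature
    k      : codebook size (number of visual words)
    lam    : assumed weight trading off visual vs. category terms
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centers = X[rng.choice(n, size=k, replace=False)]
    n_cat = labels.max() + 1
    # Start each visual word with a uniform category distribution.
    cat_dist = np.full((k, n_cat), 1.0 / n_cat)
    assign = np.zeros(n, dtype=int)
    for _ in range(iters):
        # Visual term: squared Euclidean distance to each center, shape (n, k).
        d_vis = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        # Category term: penalize words rarely used by this feature's class.
        d_cat = 1.0 - cat_dist[:, labels].T  # shape (n, k)
        assign = np.argmin(d_vis + lam * d_cat, axis=1)
        # Update centers and per-word category distributions.
        for j in range(k):
            mask = assign == j
            if mask.any():
                centers[j] = X[mask].mean(axis=0)
                counts = np.bincount(labels[mask], minlength=n_cat)
                cat_dist[j] = counts / counts.sum()
    return centers, assign
```

With `lam=0` this reduces to plain k-means; increasing `lam` pulls features of the same category toward shared visual words, which is the general behavior the abstract attributes to the category-sensitive codebook.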