A Maximum Entropy Framework for Part-Based Texture and Object Recognition

  • Authors: Svetlana Lazebnik; Cordelia Schmid; Jean Ponce
  • Affiliations: University of Illinois at Urbana-Champaign; INRIA Rhône-Alpes; University of Illinois at Urbana-Champaign
  • Venue: ICCV '05: Proceedings of the Tenth IEEE International Conference on Computer Vision (ICCV'05), Volume 1
  • Year: 2005

Abstract

This paper presents a probabilistic part-based approach for texture and object recognition. Textures are represented using a part dictionary found by quantizing the appearance of scale- or affine-invariant keypoints. Object classes are represented using a dictionary of composite semi-local parts, or groups of neighboring keypoints with stable and distinctive appearance and geometric layout. A discriminative maximum entropy framework is used to learn the posterior distribution of the class label given the occurrences of parts from the dictionary in the training set. Experiments on two texture and two object databases demonstrate the effectiveness of this framework for visual classification.
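To illustrate the kind of classifier the abstract describes, the sketch below fits a conditional maximum-entropy (softmax/exponential) model over part-occurrence histograms. This is not the authors' code: the feature functions, regularization, and optimizer used in the paper differ, and the data here is random stand-in for part counts extracted from images.

```python
# Minimal sketch, assuming each image is summarized by a histogram of how often
# each dictionary part occurs in it; a conditional exponential model
# P(class | histogram) is fit by gradient descent on the regularized
# negative log-likelihood. Illustrative only, not the paper's implementation.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)           # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_maxent(X, y, n_classes, lr=0.1, n_iter=500, l2=1e-3):
    """X: (n_images, n_parts) part-occurrence counts; y: integer class labels."""
    n, d = X.shape
    W = np.zeros((d, n_classes))                    # one weight per (part, class)
    Y = np.eye(n_classes)[y]                        # one-hot encoding of labels
    for _ in range(n_iter):
        P = softmax(X @ W)                          # posterior P(class | parts)
        grad = X.T @ (P - Y) / n + l2 * W           # gradient of the objective
        W -= lr * grad
    return W

def predict(W, X):
    return softmax(X @ W).argmax(axis=1)

# Toy usage with random counts standing in for part histograms.
rng = np.random.default_rng(0)
X = rng.poisson(2.0, size=(60, 50)).astype(float)  # 60 images, 50-part dictionary
y = rng.integers(0, 4, size=60)                     # 4 texture/object classes
W = train_maxent(X, y, n_classes=4)
print(predict(W, X)[:10])
```

Training this model by maximizing the conditional log-likelihood is equivalent to the maximum entropy principle subject to feature-expectation constraints, which is the discriminative framework the abstract refers to.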