Multi-Aspect Detection of Articulated Objects

  • Authors: Edgar Seemann; Bastian Leibe; Bernt Schiele

  • Affiliations: Darmstadt University of Technology (all authors)

  • Venue: CVPR '06 Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 2

  • Year: 2006


Abstract

A wide range of methods has been proposed to detect and recognize objects. However, effective and efficient multi-viewpoint detection of objects is still in its infancy, since most current approaches can only handle single viewpoints or aspects. This paper proposes a general approach for multi-aspect detection of objects. As the running example for detection we use pedestrians, which add a further difficulty to the problem, namely human body articulations. Global appearance changes caused by different articulations and viewpoints of pedestrians are handled in a unified manner by a generalization of the Implicit Shape Model [5]. An important property of this new approach is that it shares local appearance across different articulations and viewpoints, and therefore requires relatively few training samples. The effectiveness of the approach is demonstrated and compared to previous approaches on two datasets containing pedestrians with different articulations and from multiple viewpoints.
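
To make the idea of sharing local appearance across aspects concrete, the sketch below illustrates Implicit-Shape-Model-style Hough voting with a single shared codebook and per-aspect occurrence distributions. It is a minimal illustration under assumed data structures (the `codebook` matrix, the `occurrences` dictionary keyed by codeword and aspect, and the matching threshold are all placeholders), not the authors' exact formulation.

```python
import numpy as np

def match_codewords(descriptor, codebook, threshold=0.7):
    """Indices of codewords whose cosine similarity to the descriptor exceeds a threshold.
    The codebook is shared across all aspects (viewpoints/articulations)."""
    sims = codebook @ descriptor / (
        np.linalg.norm(codebook, axis=1) * np.linalg.norm(descriptor) + 1e-9)
    return np.nonzero(sims > threshold)[0]

def cast_votes(features, positions, codebook, occurrences,
               n_aspects, image_shape, bin_size=8):
    """Accumulate probabilistic votes for object centers in one Hough space per aspect.

    occurrences: dict mapping (codeword_index, aspect) -> list of (dx, dy) offsets
    from the feature location to the object center, learned during training.
    Only these spatial occurrence distributions are aspect-specific; the local
    appearance (codebook) is shared.
    """
    h, w = image_shape
    hough = np.zeros((n_aspects, h // bin_size + 1, w // bin_size + 1))
    for desc, (x, y) in zip(features, positions):
        matched = match_codewords(desc, codebook)
        if len(matched) == 0:
            continue
        w_match = 1.0 / len(matched)            # vote mass split uniformly over matched codewords
        for c in matched:
            for aspect in range(n_aspects):
                occs = occurrences.get((int(c), aspect), [])
                if not occs:
                    continue
                w_occ = w_match / len(occs)      # further split over stored occurrences
                for dx, dy in occs:
                    cx, cy = int(x + dx), int(y + dy)
                    if 0 <= cx < w and 0 <= cy < h:
                        hough[aspect, cy // bin_size, cx // bin_size] += w_occ
    return hough

def detect(hough, bin_size=8, min_score=2.0):
    """Return (aspect, x, y, score) hypotheses whose vote score exceeds a threshold."""
    hypotheses = []
    for aspect in range(hough.shape[0]):
        ys, xs = np.nonzero(hough[aspect] >= min_score)
        for y, x in zip(ys, xs):
            hypotheses.append((aspect, x * bin_size, y * bin_size, hough[aspect, y, x]))
    return sorted(hypotheses, key=lambda h: -h[3])
```

Because the codebook is reused for every aspect, adding a new viewpoint or articulation only requires learning new occurrence offsets rather than a new appearance model, which is why relatively few training samples per aspect can suffice.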