OpenSurfaces: a richly annotated catalog of surface appearance

  • Authors: Sean Bell, Paul Upchurch, Noah Snavely, Kavita Bala
  • Affiliations: Cornell University (all authors)
  • Venue: ACM Transactions on Graphics (TOG) - SIGGRAPH 2013 Conference Proceedings
  • Year: 2013

Abstract

The appearance of surfaces in real-world scenes is determined by the materials, textures, and context in which the surfaces appear. However, the datasets we have for visualizing and modeling rich surface appearance in context, in applications such as home remodeling, are quite limited. To help address this need, we present OpenSurfaces, a rich, labeled database consisting of thousands of examples of surfaces segmented from consumer photographs of interiors, and annotated with material parameters (reflectance, material names), texture information (surface normals, rectified textures), and contextual information (scene category and object names). Retrieving usable surface information from uncalibrated Internet photo collections is challenging. We use human annotations and present a new methodology for segmenting and annotating materials in Internet photo collections suitable for crowdsourcing (e.g., through Amazon's Mechanical Turk). Because of the noise and variability inherent in Internet photos and novice annotators, designing this annotation engine was a key challenge; we present a multi-stage set of annotation tasks with quality checks and validation. We demonstrate the use of this database in proof-of-concept applications including surface retexturing and material and image browsing, and discuss future uses. OpenSurfaces is a public resource available at http://opensurfaces.cs.cornell.edu/.
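
The abstract describes a multi-stage set of crowdsourced annotation tasks with quality checks and validation. The sketch below illustrates, in generic terms, how sentinel-based quality checks and vote aggregation in such a pipeline might be wired together; it is a minimal illustration, and every name in it (Task, submit_response, aggregate, the min_accuracy threshold) is hypothetical rather than taken from the OpenSurfaces implementation.

```python
"""Minimal sketch of a crowdsourced annotation stage with quality checks.
Hypothetical illustration only; not the OpenSurfaces codebase."""

from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Task:
    photo_id: str
    stage: str                      # e.g. "segment", "name_material", "reflectance"
    responses: list = field(default_factory=list)


def submit_response(task, worker_id, answer, sentinels, worker_scores):
    """Record a worker's answer; if the item is a sentinel with a known
    ground-truth answer, update that worker's reliability score."""
    task.responses.append((worker_id, answer))
    key = (task.photo_id, task.stage)
    if key in sentinels:                         # hidden quality-check item
        correct = (answer == sentinels[key])
        score = worker_scores.setdefault(worker_id, {"ok": 0, "total": 0})
        score["total"] += 1
        score["ok"] += int(correct)


def aggregate(task, worker_scores, min_accuracy=0.7):
    """Keep only answers from workers whose sentinel accuracy is high enough,
    then take a majority vote (a stand-in for more elaborate validation)."""
    trusted = []
    for worker_id, answer in task.responses:
        s = worker_scores.get(worker_id)
        if s is None or s["total"] == 0 or s["ok"] / s["total"] >= min_accuracy:
            trusted.append(answer)
    if not trusted:
        return None                              # needs more annotations
    return Counter(trusted).most_common(1)[0][0]


if __name__ == "__main__":
    sentinels = {("photo_001", "name_material"): "wood"}   # known answer
    scores = {}
    task = Task("photo_001", "name_material")
    for worker, answer in [("w1", "wood"), ("w2", "wood"), ("w3", "granite")]:
        submit_response(task, worker, answer, sentinels, scores)
    print(aggregate(task, scores))               # -> "wood"
```

In this kind of design, each stage (segmentation, material naming, reflectance matching, and so on) can reuse the same validation machinery, with unreliable workers filtered out before their answers feed the next stage.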