AfNet: the affordance network

  • Authors: Karthik Mahesh Varadarajan; Markus Vincze
  • Affiliations: TU Wien, Vienna, Austria (both authors)
  • Venue: ACCV'12 Proceedings of the 11th Asian Conference on Computer Vision - Volume Part I
  • Year: 2012

Abstract

There has been a growing need for object recognition systems that can characterize object constancy irrespective of lighting, shading, occlusions and viewpoint variations and, most importantly, can deal with the multitude of shapes, colors and sizes in which objects are found. Affordances, on the other hand, provide symbolic grounding mechanisms that link features obtained from visual perception with the functionality of objects, which yields the most consistent and holistic characterization of an object. Recognition by Component Affordances (RBCA) is a recent theory that builds affordance features for recognition. As an extension of the psychophysical theory of Recognition by Components (RBC) to generic visual perception, RBCA is well suited for cognitive visual processing systems that are required to perform implicit cognitive tasks. A common task is to substitute a mug, bottle, jug, pitcher, pilsner, beaker, chalice, goblet or any other unlabeled object for a cup, provided it has a physical part affording the ability to hold liquid and a part affording grasping by a human hand, given the goal of 'finding an empty cup' when no cups are available in the work environment of interest. In this paper, we present affordance features for object recognition. Using a set of 25 structural and 10 material affordances, we define a database of over 250 common household objects. This database, called the Affordance Network or AfNet, is available as a community development framework and is well suited for deployment on domestic robots. Sample object recognition results using AfNet and the associated inference engine, which grounds the affordances through visual perception features, demonstrate the effectiveness of the approach.
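
To illustrate how an affordance-indexed object database can support the kind of substitution task described above, the following is a minimal sketch in Python. The object entries, affordance names, and query function are hypothetical assumptions for illustration only; they do not reflect the actual AfNet schema, its 25 structural and 10 material affordance definitions, or its inference engine.

```python
# Minimal sketch of an affordance-indexed object database (hypothetical schema,
# not the actual AfNet format). Affordances are modeled here as plain strings.
from typing import Dict, Set, List, FrozenSet

# Toy database: each object is tagged with the affordances its parts provide.
AFFORDANCE_DB: Dict[str, Set[str]] = {
    "cup":     {"contain-liquid", "hand-graspable", "support"},
    "mug":     {"contain-liquid", "hand-graspable"},
    "pitcher": {"contain-liquid", "hand-graspable", "pour"},
    "plate":   {"support", "hand-graspable"},
    "goblet":  {"contain-liquid", "hand-graspable"},
}

def find_substitutes(required: Set[str],
                     exclude: FrozenSet[str] = frozenset()) -> List[str]:
    """Return objects whose affordances cover the required set.

    `exclude` models objects that are unavailable in the current scene,
    e.g. the literal 'cup' in the 'find an empty cup' task.
    """
    return sorted(
        name for name, affs in AFFORDANCE_DB.items()
        if required <= affs and name not in exclude
    )

if __name__ == "__main__":
    # Goal: 'find an empty cup', but no cups are available -> fall back to any
    # object affording liquid containment and grasping by a human hand.
    print(find_substitutes({"contain-liquid", "hand-graspable"},
                           exclude=frozenset({"cup"})))
    # -> ['goblet', 'mug', 'pitcher']
```

In the system described in the paper, such affordance labels are not looked up symbolically but grounded through visual perception features by the associated inference engine; the sketch only shows how an affordance-indexed lookup can resolve substitution queries once affordances have been assigned.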