Exploiting process integration and composition in the context of active vision

  • Authors:
  • J. A. Fayman;P. Pirjanian;H. I. Christensen;E. Rivlin

  • Affiliations:
  • Dept. of Comput. Sci., Technion-Israel Inst. of Technol., Haifa

  • Venue:
  • IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
  • Year:
  • 1999


Abstract

The visual robustness of biological systems is in part due to their ability to actively integrate (fuse) information from a number of visual cues. In addition to active integration, the perception-action nature of biological vision demands event-driven behavioral composition. Providing machine vision systems with similar capabilities therefore requires tools and techniques for cue integration and behavioral composition. In this paper, we address two issues. First, we present a unified approach for handling both active integration and behavioral composition. The approach combines a theoretical framework that handles uncertainty using a voting scheme with a set of behaviors committed to achieving a specific goal through common effort, together with a well-known process composition model. Second, we address the issue of integration in the active vision activity of smooth pursuit. We have experimented with the fusion of four smooth pursuit techniques (blob tracking, edge tracking, template matching and image differencing). We discuss each technique, highlighting its strengths and weaknesses, and then show that fusing the techniques according to our formal framework improves system tracking behavior.
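The fusion idea described above can be illustrated with a minimal sketch. The paper's actual voting scheme is not reproduced here; the code below assumes a simple confidence-weighted vote over the position estimates of the four cue trackers named in the abstract, with all names, weights, and values being illustrative only.

```python
# Hedged sketch of voting-based cue fusion for smooth pursuit.
# All tracker names, confidences, and positions are illustrative
# assumptions, not the paper's actual implementation.
from dataclasses import dataclass


@dataclass
class Vote:
    position: tuple[float, float]  # (x, y) target estimate in the image
    confidence: float              # tracker's self-assessed reliability in [0, 1]


def fuse(votes: list[Vote]) -> tuple[float, float]:
    """Fuse position estimates by confidence-weighted voting."""
    total = sum(v.confidence for v in votes)
    if total == 0:
        raise ValueError("no tracker reported any confidence")
    x = sum(v.position[0] * v.confidence for v in votes) / total
    y = sum(v.position[1] * v.confidence for v in votes) / total
    return (x, y)


# Four hypothetical cue trackers voting on the target position:
votes = [
    Vote((100.0, 50.0), 0.9),  # blob tracking
    Vote((102.0, 51.0), 0.7),  # edge tracking
    Vote((101.0, 49.0), 0.8),  # template matching
    Vote((120.0, 60.0), 0.1),  # image differencing (unreliable this frame)
]
fused = fuse(votes)
```

A low-confidence outlier (here, image differencing) is automatically down-weighted, which is one intuition behind why fusing redundant cues can make tracking more robust than any single cue alone.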