A hierarchical Bayesian framework for multimodal active perception

  • Authors:
  • João Filipe Ferreira, Miguel Castelo-Branco, Jorge Dias

  • Affiliations:
  • Institute of Systems and Robotics, FCT-University of Coimbra, Coimbra, Portugal
  • Biomedical Institute of Research on Light and Image, Faculty of Medicine, University of Coimbra, Coimbra, Portugal
  • Institute of Systems and Robotics, FCT-University of Coimbra, Coimbra, Portugal

  • Venue:
  • Adaptive Behavior - Animals, Animats, Software Agents, Robots, Adaptive Systems
  • Year:
  • 2012

Abstract

In this article, we present a hierarchical Bayesian framework for multimodal active perception, devised to be emergent, scalable and adaptive. This framework, while not strictly neuromimetic, finds its roots in the role of the dorsal perceptual pathway of the human brain. Its component models build upon a common spatial configuration that is naturally suited to the integration of readings from multiple sensors using a Bayesian approach devised in previous work. The framework presented in this article is shown to adequately model human-like active perception behaviours, namely by exhibiting the following desirable properties: high-level behaviour results from low-level interaction of simpler building blocks; seamless integration of additional inputs is allowed by the Bayesian Programming formalism; and the initial 'genetic imprint' of distribution parameters may be changed 'on the fly' through parameter manipulation, thus allowing for the implementation of goal-dependent behaviours (i.e. top-down influences).
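The Bayesian integration of readings from multiple sensors mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's actual framework (which operates over a common spatial configuration expressed in the Bayesian Programming formalism); it is the textbook special case of fusing conditionally independent Gaussian sensor estimates of a single quantity under a flat prior, where the posterior precision is the sum of the sensor precisions. The example scenario (visual and auditory azimuth estimates) is hypothetical.

```python
import numpy as np

def fuse_gaussian_readings(means, variances):
    """Fuse independent Gaussian sensor readings of one scalar quantity.

    With a flat prior and conditionally independent sensors, the
    posterior is Gaussian: its precision is the sum of the sensor
    precisions, and its mean is the precision-weighted average.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    fused_var = 1.0 / precisions.sum()
    fused_mean = fused_var * (precisions * means).sum()
    return fused_mean, fused_var

# Hypothetical example: a visual (precise) and an auditory (noisy)
# estimate of a sound source's azimuth, in degrees.
mean, var = fuse_gaussian_readings([10.0, 14.0], [1.0, 4.0])
print(mean, var)  # fused estimate lies closer to the more precise sensor
```

More sensors simply contribute further terms to the precision sum, which is one way the "seamless integration of additional inputs" property can be read at the level of a single fused estimate.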