Wearable Visual Robots

  • Authors:
  • W. W. Mayol; B. J. Tordoff; D. W. Murray

  • Affiliations:
  • Department of Engineering Science, University of Oxford, Oxford, UK (all authors)

  • Venue:
  • Personal and Ubiquitous Computing
  • Year:
  • 2002

Abstract

Research reported in the wearable visual computing literature has used exclusively static (or non-active) cameras, making the imagery and image measurements dependent on the wearer's posture and motions. The camera is assumed to be pointing in a good direction to view relevant parts of the scene, at best by virtue of being mounted on the wearer's head, or at worst wholly by chance. Even when pointing in roughly the correct direction, any visual processing relying on feature correspondence from a passive camera is made more difficult by the large, uncontrolled inter-image movements which occur when the wearer moves, or even breathes. This paper presents a wearable active visual sensor which achieves a level of decoupling of camera movement from the wearer's posture and motions by a combination of inertial and visual sensor feedback and active control. The issues of sensor placement, robot kinematics and their relation to wearability are discussed. The performance of the prototype robot is evaluated for some essential visual tasks. The paper also discusses potential applications for this kind of wearable robot.
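To illustrate the core idea of inertial decoupling described in the abstract, the sketch below shows a minimal proportional feedback loop that counter-rotates a hypothetical 2-DOF pan/tilt camera against gyro-measured head rotation. This is not the authors' implementation; the function name, gain, and sensor interface are assumptions for illustration only.

```python
# Hypothetical sketch: cancel gyro-measured head rotation by
# counter-rotating a 2-DOF pan/tilt camera (not the paper's controller).

def stabilize_step(pan, tilt, gyro_yaw_rate, gyro_pitch_rate, dt, gain=1.0):
    """Integrate a counter-rotation command over one control step.

    pan, tilt        -- current camera joint angles (rad)
    gyro_*_rate      -- head rotation rates from an inertial sensor (rad/s)
    dt               -- control period (s)
    gain             -- unity gain gives exact rate cancellation
    """
    pan -= gain * gyro_yaw_rate * dt    # counter-rotate against yaw
    tilt -= gain * gyro_pitch_rate * dt  # counter-rotate against pitch
    return pan, tilt

# Simulate the wearer's head turning at a constant 0.5 rad/s yaw for 1 s
# at a 100 Hz control rate; the camera counter-rotates to hold gaze.
pan, tilt = 0.0, 0.0
for _ in range(100):
    pan, tilt = stabilize_step(pan, tilt, 0.5, 0.0, 0.01)
print(round(pan, 3), tilt)  # pan ends near -0.5 rad, cancelling the turn
```

In practice such an inertial loop would be combined with visual feedback (e.g. tracking a fixation target) to correct the drift that pure rate integration accumulates, which is the combination the abstract refers to.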