Hopping odometry: motion estimation with selective vision

  • Authors:
  • Edmond Wai Yan So;Tetsuo Yoshimitsu;Takashi Kubota

  • Affiliations:
  • School of Physical Sciences, Department of Space and Astronautical Science, The Graduate University for Advanced Studies, Sagamihara-shi, Japan;Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, Sagamihara-shi, Japan;Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, Sagamihara-shi, Japan

  • Venue:
  • IROS'09 Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009


Abstract

We present a two-step iterative algorithm to estimate the trajectory of a hopping rover. In the first step, a monocular visual odometry scheme is adapted to estimate an initial portion of the hopping trajectory. From this, the parameters of the ballistic motion are recovered, and the trajectory is extrapolated to predict the positions of the rover for the remainder of the hop. In the second step, we devise a scheme called "selective vision", combining the ideas of active vision and guided search. An envelope lying between the start and end of a hop is defined, within which features most likely to be re-observed across a hop are detected and matched. Performing pose estimation on these matched features allows the relative motion between a camera frame within the visual odometry step and a camera frame within the extrapolated trajectory to be estimated. The newly determined camera frame in the extrapolated trajectory can then be used to refine the parameters of the ballistic motion, and the trajectory can be re-extrapolated to predict future positions of the hopping rover. Following this scheme, it is possible to estimate the trajectory of a hopping rover undergoing continuous rotational motion with only one set of cameras, without continuous tracking of terrain features.
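The first step — recovering ballistic parameters from an initial, partially observed trajectory and extrapolating the rest of the hop — can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes noise-free position estimates from visual odometry, a constant (unknown) gravitational acceleration, and a simple per-axis least-squares fit of p(t) = p0 + v0·t + ½·a·t². The function names and the synthetic asteroid-scale gravity value are illustrative:

```python
import numpy as np

def fit_ballistic(times, positions):
    """Least-squares fit of p(t) = p0 + v0*t + 0.5*a*t^2 per axis.

    times: (N,) observation timestamps; positions: (N, 3) estimated positions.
    Returns (p0, v0, a), each a 3-vector.
    """
    t = np.asarray(times, dtype=float)
    P = np.asarray(positions, dtype=float)
    # Design matrix with columns [1, t, t^2/2] so coefficients are p0, v0, a.
    A = np.stack([np.ones_like(t), t, 0.5 * t**2], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, P, rcond=None)   # shape (3, 3)
    return coeffs[0], coeffs[1], coeffs[2]

def extrapolate(p0, v0, a, times):
    """Predict positions along the fitted ballistic arc at the given times."""
    t = np.asarray(times, dtype=float)[:, None]
    return p0 + v0 * t + 0.5 * a * t**2

# Synthetic hop: observe the first part, then predict a future position.
g = np.array([0.0, 0.0, -0.0012])          # illustrative asteroid-scale gravity, m/s^2
v_launch = np.array([0.05, 0.0, 0.08])     # illustrative launch velocity, m/s
t_obs = np.linspace(0.0, 20.0, 15)
p_obs = extrapolate(np.zeros(3), v_launch, g, t_obs)

p0, v0, a = fit_ballistic(t_obs, p_obs)
p_future = extrapolate(p0, v0, a, np.array([60.0]))
```

In the paper's scheme, each new camera pose recovered by selective vision would be fed back into such a fit, refining (p0, v0, a) and re-extrapolating the remainder of the hop.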