Docking is a fundamental requirement for a mobile robot to be able to interact with objects in its environment. In this paper we present an algorithm and implementation for a special case of the docking problem for ground-based robots. We require the robot to dock with a fixated environment point using only visual information. Specifically, camera pan/tilt information is unknown, as are the direction of motion with respect to the object and the robot's velocity; camera calibration is also unavailable. The aim is to minimise the difference between the camera optical axis and the robot heading direction, constituting a behaviour for controlling robot direction based on fixation. This paper presents a full mathematical derivation of the method and the implementation used. In its most general form, the method requires partial segmentation of the optical flow field. The experiments presented, however, assume partial knowledge of whether points are closer to the camera than the fixation point or further away. There are many scenarios in robotic navigation where such assumptions are typical working conditions. We examine two cases: convex objects, and a distant background/floor. The solution presented uses only the rotational component of optical flow from a log-polar sensor. Results are presented for both real and ray-traced image sequences. The robot is controlled using a single component of optical flow over a small portion of the image, and is thus suited to real-time implementation.
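To make the log-polar sensor idea concrete, the sketch below resamples a conventional image onto a log-polar grid, where camera rotation about the optical axis becomes a pure shift along the angular axis. This is a minimal illustration only: the grid sizes, sampling density, and nearest-neighbour interpolation are assumptions for the example, not details taken from the paper's sensor.

```python
import numpy as np

def to_log_polar(image, n_rings=64, n_wedges=128, r_min=1.0):
    """Resample a grayscale image onto a log-polar grid centred on the
    image midpoint. Ring radii are spaced uniformly in log(r), so the
    sampling is dense near the fixation point and coarse in the periphery.
    (Illustrative parameters; nearest-neighbour sampling for brevity.)"""
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cx, cy)
    # Equal steps in log(r) give exponentially growing ring radii.
    rho = np.linspace(np.log(r_min), np.log(r_max), n_rings)
    theta = np.linspace(0.0, 2.0 * np.pi, n_wedges, endpoint=False)
    r = np.exp(rho)[:, None]                                  # (n_rings, 1)
    ys = np.clip(np.round(cy + r * np.sin(theta)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + r * np.cos(theta)).astype(int), 0, w - 1)
    return image[ys, xs]                                      # (n_rings, n_wedges)
```

On a radially symmetric input each output row (one ring) is nearly constant, and a rotation of the input simply translates the output columns, which is why only one flow component over a small image region needs to be tracked.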