In rescue operations, walking robots offer great flexibility in traversing uneven terrain in uncontrolled environments. For such a rescue robot, every motion is a potential vital sign, so the robot must be sensitive enough to detect it while maintaining high accuracy to avoid false alarms. However, existing motion detection techniques have severe limitations in dealing with the strong ego-motion of walking robots. This paper proposes an optical flow-based method for detecting moving objects with a single camera mounted on a hexapod robot. The proposed algorithm estimates and compensates ego-motion using a first-order-flow motion model, enabling object detection from a continuously moving robot. It handles strong rotation and translation in 3D with four degrees of freedom. Two alternative object detection methods, 2D-histogram-based vector clustering and motion-compensated frame differencing, are examined for detecting slow- and fast-moving objects, respectively. An FPGA implementation, with resource utilization optimized through SW/HW codesign, processes video frames in real time at 31 fps. The new algorithm offers a significant improvement over the state of the art under harsh ego-motion and performs equally well under smooth motion.
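To make the ego-motion compensation step concrete, the following is a minimal Python/OpenCV sketch, not the paper's FPGA implementation: it fits a four-parameter first-order flow model (translation tx, ty; divergence d; rotation r) to sparse Lucas-Kanade feature tracks by least squares, warps the previous frame with the equivalent affine transform, and takes the motion-compensated frame difference so that residual motion highlights independently moving objects. The feature tracker, parameter values, and function names are illustrative assumptions, not details from the paper.

```python
import numpy as np
import cv2

def fit_first_order_flow(pts, flows):
    # Least-squares fit of the 4-DOF first-order flow model
    #   u = tx + d*x - r*y
    #   v = ty + r*x + d*y
    # (tx, ty): translation, d: divergence (zoom), r: rotation.
    x, y = pts[:, 0], pts[:, 1]
    ones, zeros = np.ones_like(x), np.zeros_like(x)
    A = np.vstack([
        np.column_stack([ones, zeros, x, -y]),  # equations for u
        np.column_stack([zeros, ones, y,  x]),  # equations for v
    ])
    b = np.concatenate([flows[:, 0], flows[:, 1]])
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params  # tx, ty, d, r

def compensated_difference(prev_gray, curr_gray):
    # Track sparse features (Shi-Tomasi corners + pyramidal Lucas-Kanade).
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                 qualityLevel=0.01, minDistance=8)
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    ok = status.ravel() == 1
    p0, p1 = p0.reshape(-1, 2)[ok], p1.reshape(-1, 2)[ok]

    # Fit the global (ego-motion) flow and express it as a 2x3 affine warp:
    # x' = (1+d)*x - r*y + tx,  y' = r*x + (1+d)*y + ty
    tx, ty, d, r = fit_first_order_flow(p0, p1 - p0)
    M = np.float32([[1 + d, -r, tx],
                    [r, 1 + d, ty]])
    h, w = prev_gray.shape
    warped = cv2.warpAffine(prev_gray, M, (w, h))

    # After ego-motion compensation, large residuals indicate
    # independently moving objects.
    return cv2.absdiff(curr_gray, warped)
```

In practice the plain least-squares fit would be replaced by a robust estimator (e.g., RANSAC over the tracked features), since flow vectors on moving objects are outliers with respect to the ego-motion model; thresholding or clustering the residual image then yields the object detections.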