A hierarchy of cameras for 3D photography
Computer Vision and Image Understanding - Model-based and image-based 3D scene representation for interactive visualization
We investigate the relationship between camera design and the problem of recovering the motion and structure of a scene from video data. The visual information that could possibly be obtained is described by the plenoptic function. A camera can be viewed as a device that captures a subset of this function, that is, it measures some of the light rays in some part of the space. The information contained in this subset determines how difficult it is to solve subsequent interpretation processes. By examining the differential structure of the time-varying plenoptic function, we relate different known and new camera models to the spatio-temporal structure of the observed scene.

This allows us to define a hierarchy of camera designs, where the order is determined by the stability and complexity of the computations necessary to estimate structure and motion. At the low end of this hierarchy is the standard planar pinhole camera, for which the structure from motion problem is non-linear and ill-posed. At the high end is a new camera, which we call the full field of view polydioptric camera, for which the problem is linear and stable. In between are multiple-view cameras with large fields of view, which we have built, as well as catadioptric panoramic sensors and other omni-directional cameras. We develop design suggestions for the polydioptric camera, and based upon this new design we propose a linear algorithm for ego-motion estimation, which in essence combines differential motion estimation with differential stereo.
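To make the linearity claim concrete, the following is a minimal numerical sketch (not the paper's actual algorithm) of why ego-motion estimation becomes a linear least-squares problem once local depth is available, as from differential stereo. It assumes a full field of view modeled as a viewing sphere, noiseless spherical motion-field measurements, and known inverse depth per ray; the helper `skew` and all variable names are illustrative.

```python
import numpy as np

def skew(a):
    """Cross-product matrix: skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

rng = np.random.default_rng(0)

# Hypothetical ground-truth rigid motion: translation v, rotation rate omega.
v_true = np.array([0.10, -0.05, 0.20])
w_true = np.array([0.02, 0.01, -0.03])

# Viewing directions q sampled over the full sphere, with scene depths Z.
q = rng.normal(size=(500, 3))
q /= np.linalg.norm(q, axis=1, keepdims=True)
Z = rng.uniform(2.0, 10.0, size=len(q))

# Spherical motion field for a point at depth Z along direction q:
#   u = -(1/Z) (I - q q^T) v  +  q x omega
flow = np.array([-(np.eye(3) - np.outer(qi, qi)) @ v_true / Zi
                 + np.cross(qi, w_true)
                 for qi, Zi in zip(q, Z)])

# With inverse depth 1/Z known (e.g. from differential stereo), every flow
# component is linear in the six motion unknowns: stack A [v; omega] = b.
A = np.vstack([np.hstack([-(np.eye(3) - np.outer(qi, qi)) / Zi, skew(qi)])
               for qi, Zi in zip(q, Z)])
b = flow.reshape(-1)

motion, *_ = np.linalg.lstsq(A, b, rcond=None)
v_est, w_est = motion[:3], motion[3:]
```

With a narrow field of view and unknown depth, the same equations couple depth and translation multiplicatively and the problem turns non-linear; the sketch shows how full-sphere coverage plus per-ray depth removes that coupling.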