Calibration from Statistical Properties of the Visual World
ECCV '08: Proceedings of the 10th European Conference on Computer Vision, Part IV
We consider the problem of estimating the relative orientations of a number of individual photocells, or pixels, that hold fixed relative positions. The photocells measure the intensity of light traveling along a pencil of lines. We assume that the sampled light-field changes over time, e.g. as a result of sensor motion, and use the resulting measurements to estimate the orientations of the photocells. Our approach is based on correlation and information-theoretic dissimilarity measures. Experiments with real-world data show that these dissimilarity measures are strongly related to the angular separation between photocells, and that the relation can be modeled quantitatively. In particular, we show that this model allows the angular separation to be estimated from the dissimilarity. Although the resulting estimators are not very accurate, they maintain their performance across different visual environments, suggesting that the model encodes a very general property of our visual world. Finally, leveraging this method for estimating angles from signal pairs, we show how distance-geometry techniques allow the complete sensor geometry to be recovered.
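The pipeline described above has three steps: compute a pairwise dissimilarity between photocell intensity signals, map dissimilarities to angular separations, and recover the sensor geometry by distance geometry. The following is a minimal sketch, not the paper's implementation: the linear dissimilarity-to-angle map is a hypothetical stand-in for the paper's fitted quantitative model, and the recovery step uses classical multidimensional scaling on the Gram matrix of direction cosines, a standard distance-geometry technique.

```python
import numpy as np

def dissimilarity(signals):
    """Correlation-based dissimilarity between intensity time series.

    signals: (n_pixels, n_samples) array; each row is one photocell's
    measurements over time. Returns d with d[i, j] = 1 - corr(i, j).
    """
    return 1.0 - np.corrcoef(signals)

def angles_from_dissimilarity(d, scale=1.0):
    """Hypothetical monotone model: angular separation proportional to
    dissimilarity (the paper fits an empirical model instead)."""
    return np.clip(scale * d, 0.0, np.pi)

def recover_geometry(theta):
    """Recover unit viewing directions from pairwise angles theta via
    classical MDS: the Gram matrix G[i, j] = cos(theta[i, j]) of the
    (unknown) unit directions is rank 3, so its top-3 eigenpairs give
    a 3-D embedding, unique up to a global rotation/reflection."""
    g = np.cos(theta)
    w, v = np.linalg.eigh(g)                 # ascending eigenvalues
    idx = np.argsort(w)[::-1][:3]            # top-3 components
    dirs = v[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
    # Re-normalize rows so each recovered direction is a unit vector.
    return dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
```

Because only pairwise angles are observed, the recovered directions are determined up to a global rotation and reflection; comparing Gram matrices (`rec @ rec.T` against the ground-truth direction cosines) checks the reconstruction independently of that ambiguity.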