Tracking Human Motion in Structured Environments Using a Distributed-Camera System
IEEE Transactions on Pattern Analysis and Machine Intelligence
Multiple view geometry in computer vision
Human Tracking Using Distributed Vision Systems
ICPR '98 Proceedings of the 14th International Conference on Pattern Recognition - Volume 1
Distributed vision system: a perceptual information infrastructure for robot navigation
IJCAI '97 Proceedings of the 15th International Joint Conference on Artificial Intelligence - Volume 1
Self-calibration of a vision-based sensor network
Image and Vision Computing
Performance analysis for gait in camera networks
AREA '08 Proceedings of the 1st ACM workshop on Analysis and retrieval of events/actions and workflows in video streams
Continuous learning of a multilayered network topology in a video camera network
Journal on Image and Video Processing - Special issue on video-based modeling, analysis, and recognition of human motion
Performance analysis for automated gait extraction and recognition in multi-camera surveillance
Multimedia Tools and Applications
Camera localization in distributed networks using trajectory estimation
Journal of Electrical and Computer Engineering
This paper investigates a problem arising from ubiquitous sensing: can the positions of a set of randomly placed sensors be determined automatically even when their fields of view do not overlap? (If the views overlapped, standard stereo autocalibration could be used.) The paper shows that the problem is solvable. Distant moving features allow accurate orientation registration; given the sensor orientations, nearby linearly moving features then allow full pose registration, up to a scale factor.
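The orientation-registration step can be illustrated with a small sketch. When two sensors observe the same very distant features, the translation between the sensors is negligible, so their bearing vectors to each feature differ only by the relative rotation. Recovering that rotation from matched unit bearings is Wahba's problem, solvable in closed form via SVD (the Kabsch method). The function name and the SVD-based solver below are illustrative choices, not the paper's actual algorithm:

```python
import numpy as np

def relative_orientation(dirs_a, dirs_b):
    """Estimate the rotation R such that dirs_b[i] ~= R @ dirs_a[i].

    dirs_a, dirs_b: (N, 3) arrays of unit bearing vectors to the same
    distant features, expressed in each sensor's local frame. Because
    the features are far away, sensor translation is negligible and the
    bearings differ only by the relative rotation (Wahba's problem).
    """
    # Correlation matrix of matched bearings.
    H = dirs_a.T @ dirs_b
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det R = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

With three or more non-degenerate bearing matches the rotation is recovered exactly in the noise-free case; with noisy bearings the same SVD solution gives the least-squares optimal rotation.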